# Dataset Card for Evaluation run of OpenAssistant/pythia-12b-sft-v8-7k-steps
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/OpenAssistant/pythia-12b-sft-v8-7k-steps
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [OpenAssistant/pythia-12b-sft-v8-7k-steps](https://huggingface.co/OpenAssistant/pythia-12b-sft-v8-7k-steps) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration, "results", stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_OpenAssistant__pythia-12b-sft-v8-7k-steps",
"harness_winogrande_5",
split="train")
```
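
The aggregated "results" configuration and any per-task configuration can be loaded the same way; here is a minimal sketch (the config and split names below are taken from this repository's configuration list):

```python
from datasets import load_dataset

# The "results" configuration holds the aggregated metrics; the "latest"
# split always mirrors the most recent run.
results = load_dataset(
    "open-llm-leaderboard/details_OpenAssistant__pythia-12b-sft-v8-7k-steps",
    "results",
    split="latest",
)

# Per-task details work the same way, e.g. the 5-shot GSM8K run:
gsm8k_details = load_dataset(
    "open-llm-leaderboard/details_OpenAssistant__pythia-12b-sft-v8-7k-steps",
    "harness_gsm8k_5",
    split="latest",
)
print(gsm8k_details[0])  # inspect one evaluated example
```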
## Latest results
These are the [latest results from run 2023-10-15T22:42:11.722457](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenAssistant__pythia-12b-sft-v8-7k-steps/blob/main/results_2023-10-15T22-42-11.722457.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"em": 0.00041946308724832214,
"em_stderr": 0.000209698547078268,
"f1": 0.04836619127516802,
"f1_stderr": 0.0011660409478930682,
"acc": 0.3794319917806236,
"acc_stderr": 0.010932628099092904
},
"harness|drop|3": {
"em": 0.00041946308724832214,
"em_stderr": 0.000209698547078268,
"f1": 0.04836619127516802,
"f1_stderr": 0.0011660409478930682
},
"harness|gsm8k|5": {
"acc": 0.10614101592115238,
"acc_stderr": 0.008484346948434564
},
"harness|winogrande|5": {
"acc": 0.6527229676400947,
"acc_stderr": 0.013380909249751242
}
}
```
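
If you prefer working with the raw results file rather than the `datasets` configurations, it can be fetched directly from the repository; a sketch using `huggingface_hub`, assuming the file's top level matches the dictionary shown above:

```python
import json

from huggingface_hub import hf_hub_download

# Download the raw results file for this run from the dataset repository.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_OpenAssistant__pythia-12b-sft-v8-7k-steps",
    filename="results_2023-10-15T22-42-11.722457.json",
    repo_type="dataset",
)
with open(path) as f:
    data = json.load(f)

# Assumes the layout shown above, with an "all" entry of averaged metrics.
print(data["all"]["acc"], data["all"]["acc_stderr"])
```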
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
id: open-llm-leaderboard/details_OpenAssistant__pythia-12b-sft-v8-7k-steps
tags: ["region:us"]
created_at: 2023-08-18T10:12:44+00:00
metadata:
{"pretty_name": "Evaluation run of OpenAssistant/pythia-12b-sft-v8-7k-steps", "dataset_summary": "Dataset automatically created during the evaluation run of model [OpenAssistant/pythia-12b-sft-v8-7k-steps](https://huggingface.co/OpenAssistant/pythia-12b-sft-v8-7k-steps) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_OpenAssistant__pythia-12b-sft-v8-7k-steps\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-15T22:42:11.722457](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenAssistant__pythia-12b-sft-v8-7k-steps/blob/main/results_2023-10-15T22-42-11.722457.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.00041946308724832214,\n \"em_stderr\": 0.000209698547078268,\n \"f1\": 0.04836619127516802,\n \"f1_stderr\": 0.0011660409478930682,\n \"acc\": 0.3794319917806236,\n \"acc_stderr\": 0.010932628099092904\n },\n \"harness|drop|3\": {\n \"em\": 0.00041946308724832214,\n \"em_stderr\": 0.000209698547078268,\n \"f1\": 0.04836619127516802,\n \"f1_stderr\": 0.0011660409478930682\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.10614101592115238,\n \"acc_stderr\": 0.008484346948434564\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6527229676400947,\n \"acc_stderr\": 0.013380909249751242\n }\n}\n```", "repo_url": "https://huggingface.co/OpenAssistant/pythia-12b-sft-v8-7k-steps", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T18_12_25.184971", "path": ["**/details_harness|arc:challenge|25_2023-07-19T18:12:25.184971.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T18:12:25.184971.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_15T22_42_11.722457", "path": ["**/details_harness|drop|3_2023-10-15T22-42-11.722457.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-15T22-42-11.722457.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_15T22_42_11.722457", "path": ["**/details_harness|gsm8k|5_2023-10-15T22-42-11.722457.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-15T22-42-11.722457.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T18_12_25.184971", "path": ["**/details_harness|hellaswag|10_2023-07-19T18:12:25.184971.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hellaswag|10_2023-07-19T18:12:25.184971.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T18_12_25.184971", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:12:25.184971.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:12:25.184971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:12:25.184971.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:12:25.184971.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T18:12:25.184971.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:12:25.184971.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_19T18_12_25.184971", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:12:25.184971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:12:25.184971.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T18_12_25.184971", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:12:25.184971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:12:25.184971.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T18_12_25.184971", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:12:25.184971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:12:25.184971.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T18_12_25.184971", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:12:25.184971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:12:25.184971.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T18_12_25.184971", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:12:25.184971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:12:25.184971.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T18_12_25.184971", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:12:25.184971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:12:25.184971.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T18_12_25.184971", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:12:25.184971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:12:25.184971.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T18_12_25.184971", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:12:25.184971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:12:25.184971.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T18_12_25.184971", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:12:25.184971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:12:25.184971.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T18_12_25.184971", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:12:25.184971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:12:25.184971.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_07_19T18_12_25.184971", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:12:25.184971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:12:25.184971.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T18_12_25.184971", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:12:25.184971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:12:25.184971.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T18_12_25.184971", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:12:25.184971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:12:25.184971.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T18_12_25.184971", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:12:25.184971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:12:25.184971.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T18_12_25.184971", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:12:25.184971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:12:25.184971.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T18_12_25.184971", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:12:25.184971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:12:25.184971.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T18_12_25.184971", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:12:25.184971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:12:25.184971.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T18_12_25.184971", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:12:25.184971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:12:25.184971.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T18_12_25.184971", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:12:25.184971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:12:25.184971.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T18_12_25.184971", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:12:25.184971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:12:25.184971.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T18_12_25.184971", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:12:25.184971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:12:25.184971.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T18_12_25.184971", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:12:25.184971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:12:25.184971.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T18_12_25.184971", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:12:25.184971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:12:25.184971.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T18_12_25.184971", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:12:25.184971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:12:25.184971.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T18_12_25.184971", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:12:25.184971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:12:25.184971.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T18_12_25.184971", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:12:25.184971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:12:25.184971.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T18_12_25.184971", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:12:25.184971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:12:25.184971.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T18_12_25.184971", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:12:25.184971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:12:25.184971.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T18_12_25.184971", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:12:25.184971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:12:25.184971.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T18_12_25.184971", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:12:25.184971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:12:25.184971.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T18_12_25.184971", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:12:25.184971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:12:25.184971.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T18_12_25.184971", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:12:25.184971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:12:25.184971.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T18_12_25.184971", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:12:25.184971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:12:25.184971.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T18_12_25.184971", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:12:25.184971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:12:25.184971.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T18_12_25.184971", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:12:25.184971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:12:25.184971.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T18_12_25.184971", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:12:25.184971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:12:25.184971.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T18_12_25.184971", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:12:25.184971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:12:25.184971.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T18_12_25.184971", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:12:25.184971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:12:25.184971.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T18_12_25.184971", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T18:12:25.184971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T18:12:25.184971.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T18_12_25.184971", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:12:25.184971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:12:25.184971.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T18_12_25.184971", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:12:25.184971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:12:25.184971.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T18_12_25.184971", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:12:25.184971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:12:25.184971.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T18_12_25.184971", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:12:25.184971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:12:25.184971.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T18_12_25.184971", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:12:25.184971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:12:25.184971.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T18_12_25.184971", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:12:25.184971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:12:25.184971.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T18_12_25.184971", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:12:25.184971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:12:25.184971.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T18_12_25.184971", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:12:25.184971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:12:25.184971.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T18_12_25.184971", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:12:25.184971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:12:25.184971.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T18_12_25.184971", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:12:25.184971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:12:25.184971.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T18_12_25.184971", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:12:25.184971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:12:25.184971.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T18_12_25.184971", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:12:25.184971.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:12:25.184971.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_19T18_12_25.184971", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:12:25.184971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:12:25.184971.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T18_12_25.184971", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:12:25.184971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:12:25.184971.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T18_12_25.184971", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:12:25.184971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:12:25.184971.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T18_12_25.184971", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:12:25.184971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:12:25.184971.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T18_12_25.184971", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T18:12:25.184971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T18:12:25.184971.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T18_12_25.184971", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:12:25.184971.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:12:25.184971.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T18_12_25.184971", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T18:12:25.184971.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T18:12:25.184971.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_15T22_42_11.722457", "path": ["**/details_harness|winogrande|5_2023-10-15T22-42-11.722457.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-15T22-42-11.722457.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T18_12_25.184971", "path": ["results_2023-07-19T18:12:25.184971.parquet"]}, {"split": "2023_10_15T22_42_11.722457", "path": ["results_2023-10-15T22-42-11.722457.parquet"]}, {"split": "latest", "path": ["results_2023-10-15T22-42-11.722457.parquet"]}]}]}
last_modified: 2023-10-15T21:42:23+00:00
arxiv: []
languages: []
# Dataset Card for Evaluation run of OpenAssistant/pythia-12b-sft-v8-2.5k-steps
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/OpenAssistant/pythia-12b-sft-v8-2.5k-steps
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [OpenAssistant/pythia-12b-sft-v8-2.5k-steps](https://huggingface.co/OpenAssistant/pythia-12b-sft-v8-2.5k-steps) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration, "results", stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_OpenAssistant__pythia-12b-sft-v8-2.5k-steps",
"harness_winogrande_5",
split="train")
```
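
Since this repository contains two runs, a specific run can also be selected by its timestamped split name instead of "latest"; a sketch, with the split name taken from this repository's configuration list:

```python
from datasets import load_dataset

# Each run is stored under a split named after its timestamp; "latest"
# always points to the most recent one.
drop_run = load_dataset(
    "open-llm-leaderboard/details_OpenAssistant__pythia-12b-sft-v8-2.5k-steps",
    "harness_drop_3",
    split="2023_10_19T06_05_53.274569",
)
print(len(drop_run))  # number of evaluated DROP examples in that run
```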
## Latest results
These are the [latest results from run 2023-10-19T06:05:53.274569](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenAssistant__pythia-12b-sft-v8-2.5k-steps/blob/main/results_2023-10-19T06-05-53.274569.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"em": 0.0008389261744966443,
"em_stderr": 0.0002964962989801232,
"f1": 0.04918729026845652,
"f1_stderr": 0.0011958323498480873,
"acc": 0.3760981059411563,
"acc_stderr": 0.010720714478256874
},
"harness|drop|3": {
"em": 0.0008389261744966443,
"em_stderr": 0.0002964962989801232,
"f1": 0.04918729026845652,
"f1_stderr": 0.0011958323498480873
},
"harness|gsm8k|5": {
"acc": 0.09552691432903715,
"acc_stderr": 0.008096605771155738
},
"harness|winogrande|5": {
"acc": 0.6566692975532754,
"acc_stderr": 0.013344823185358009
}
}
```
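
Once this results JSON is parsed into a Python dictionary, individual metrics can be read out directly; a minimal illustration using the values shown above:

```python
# `results` stands in for the parsed JSON above (e.g. via json.load).
results = {
    "harness|gsm8k|5": {"acc": 0.09552691432903715, "acc_stderr": 0.008096605771155738},
    "harness|winogrande|5": {"acc": 0.6566692975532754, "acc_stderr": 0.013344823185358009},
}
for task, metrics in results.items():
    print(f"{task}: acc = {metrics['acc']:.4f} (+/- {metrics['acc_stderr']:.4f})")
```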
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
id: open-llm-leaderboard/details_OpenAssistant__pythia-12b-sft-v8-2.5k-steps
tags: ["region:us"]
created_at: 2023-08-18T10:12:53+00:00
metadata:
{"pretty_name": "Evaluation run of OpenAssistant/pythia-12b-sft-v8-2.5k-steps", "dataset_summary": "Dataset automatically created during the evaluation run of model [OpenAssistant/pythia-12b-sft-v8-2.5k-steps](https://huggingface.co/OpenAssistant/pythia-12b-sft-v8-2.5k-steps) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_OpenAssistant__pythia-12b-sft-v8-2.5k-steps\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-19T06:05:53.274569](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenAssistant__pythia-12b-sft-v8-2.5k-steps/blob/main/results_2023-10-19T06-05-53.274569.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0008389261744966443,\n \"em_stderr\": 0.0002964962989801232,\n \"f1\": 0.04918729026845652,\n \"f1_stderr\": 0.0011958323498480873,\n \"acc\": 0.3760981059411563,\n \"acc_stderr\": 0.010720714478256874\n },\n \"harness|drop|3\": {\n \"em\": 0.0008389261744966443,\n \"em_stderr\": 0.0002964962989801232,\n \"f1\": 0.04918729026845652,\n \"f1_stderr\": 0.0011958323498480873\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.09552691432903715,\n \"acc_stderr\": 0.008096605771155738\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6566692975532754,\n \"acc_stderr\": 0.013344823185358009\n }\n}\n```", "repo_url": "https://huggingface.co/OpenAssistant/pythia-12b-sft-v8-2.5k-steps", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T18_14_20.845496", "path": ["**/details_harness|arc:challenge|25_2023-07-19T18:14:20.845496.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T18:14:20.845496.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_19T06_05_53.274569", "path": ["**/details_harness|drop|3_2023-10-19T06-05-53.274569.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-19T06-05-53.274569.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_19T06_05_53.274569", "path": ["**/details_harness|gsm8k|5_2023-10-19T06-05-53.274569.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-19T06-05-53.274569.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T18_14_20.845496", "path": ["**/details_harness|hellaswag|10_2023-07-19T18:14:20.845496.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hellaswag|10_2023-07-19T18:14:20.845496.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T18_14_20.845496", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:14:20.845496.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:14:20.845496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:14:20.845496.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:14:20.845496.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T18:14:20.845496.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:14:20.845496.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_19T18_14_20.845496", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:14:20.845496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:14:20.845496.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T18_14_20.845496", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:14:20.845496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:14:20.845496.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T18_14_20.845496", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:14:20.845496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:14:20.845496.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T18_14_20.845496", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:14:20.845496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:14:20.845496.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T18_14_20.845496", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:14:20.845496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:14:20.845496.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T18_14_20.845496", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:14:20.845496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:14:20.845496.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T18_14_20.845496", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:14:20.845496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:14:20.845496.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T18_14_20.845496", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:14:20.845496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:14:20.845496.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T18_14_20.845496", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:14:20.845496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:14:20.845496.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T18_14_20.845496", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:14:20.845496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:14:20.845496.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_07_19T18_14_20.845496", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:14:20.845496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:14:20.845496.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T18_14_20.845496", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:14:20.845496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:14:20.845496.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T18_14_20.845496", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:14:20.845496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:14:20.845496.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T18_14_20.845496", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:14:20.845496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:14:20.845496.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T18_14_20.845496", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:14:20.845496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:14:20.845496.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T18_14_20.845496", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:14:20.845496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:14:20.845496.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T18_14_20.845496", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:14:20.845496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:14:20.845496.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T18_14_20.845496", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:14:20.845496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:14:20.845496.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T18_14_20.845496", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:14:20.845496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:14:20.845496.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T18_14_20.845496", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:14:20.845496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:14:20.845496.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T18_14_20.845496", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:14:20.845496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:14:20.845496.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T18_14_20.845496", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:14:20.845496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:14:20.845496.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T18_14_20.845496", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:14:20.845496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:14:20.845496.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T18_14_20.845496", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:14:20.845496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:14:20.845496.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T18_14_20.845496", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:14:20.845496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:14:20.845496.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T18_14_20.845496", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:14:20.845496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:14:20.845496.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T18_14_20.845496", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:14:20.845496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:14:20.845496.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T18_14_20.845496", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:14:20.845496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:14:20.845496.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T18_14_20.845496", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:14:20.845496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:14:20.845496.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T18_14_20.845496", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:14:20.845496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:14:20.845496.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T18_14_20.845496", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:14:20.845496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:14:20.845496.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T18_14_20.845496", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:14:20.845496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:14:20.845496.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T18_14_20.845496", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:14:20.845496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:14:20.845496.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T18_14_20.845496", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:14:20.845496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:14:20.845496.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T18_14_20.845496", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:14:20.845496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:14:20.845496.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T18_14_20.845496", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:14:20.845496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:14:20.845496.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T18_14_20.845496", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:14:20.845496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:14:20.845496.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T18_14_20.845496", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:14:20.845496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:14:20.845496.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T18_14_20.845496", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T18:14:20.845496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T18:14:20.845496.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T18_14_20.845496", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:14:20.845496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:14:20.845496.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T18_14_20.845496", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:14:20.845496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:14:20.845496.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T18_14_20.845496", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:14:20.845496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:14:20.845496.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T18_14_20.845496", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:14:20.845496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:14:20.845496.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T18_14_20.845496", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:14:20.845496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:14:20.845496.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T18_14_20.845496", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:14:20.845496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:14:20.845496.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T18_14_20.845496", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:14:20.845496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:14:20.845496.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T18_14_20.845496", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:14:20.845496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:14:20.845496.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T18_14_20.845496", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:14:20.845496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:14:20.845496.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T18_14_20.845496", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:14:20.845496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:14:20.845496.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T18_14_20.845496", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:14:20.845496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:14:20.845496.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T18_14_20.845496", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:14:20.845496.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:14:20.845496.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_19T18_14_20.845496", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:14:20.845496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:14:20.845496.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T18_14_20.845496", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:14:20.845496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:14:20.845496.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T18_14_20.845496", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:14:20.845496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:14:20.845496.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T18_14_20.845496", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:14:20.845496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:14:20.845496.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T18_14_20.845496", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T18:14:20.845496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T18:14:20.845496.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T18_14_20.845496", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:14:20.845496.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:14:20.845496.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T18_14_20.845496", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T18:14:20.845496.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T18:14:20.845496.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_19T06_05_53.274569", "path": ["**/details_harness|winogrande|5_2023-10-19T06-05-53.274569.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-19T06-05-53.274569.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T18_14_20.845496", "path": ["results_2023-07-19T18:14:20.845496.parquet"]}, {"split": "2023_10_19T06_05_53.274569", "path": ["results_2023-10-19T06-05-53.274569.parquet"]}, {"split": "latest", "path": ["results_2023-10-19T06-05-53.274569.parquet"]}]}]}
|
2023-10-19T05:06:05+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of OpenAssistant/pythia-12b-sft-v8-2.5k-steps
## Dataset Description
- Homepage:
- Repository: https://huggingface.co/OpenAssistant/pythia-12b-sft-v8-2.5k-steps
- Paper:
- Leaderboard: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- Point of Contact: [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model OpenAssistant/pythia-12b-sft-v8-2.5k-steps on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
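```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_OpenAssistant__pythia-12b-sft-v8-2.5k-steps",
	"harness_winogrande_5",
	split="train")
```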
## Latest results
These are the latest results from run 2023-10-19T06:05:53.274569 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
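```python
{
    "all": {
        "em": 0.0008389261744966443,
        "em_stderr": 0.0002964962989801232,
        "f1": 0.04918729026845652,
        "f1_stderr": 0.0011958323498480873,
        "acc": 0.3760981059411563,
        "acc_stderr": 0.010720714478256874
    },
    "harness|drop|3": {
        "em": 0.0008389261744966443,
        "em_stderr": 0.0002964962989801232,
        "f1": 0.04918729026845652,
        "f1_stderr": 0.0011958323498480873
    },
    "harness|gsm8k|5": {
        "acc": 0.09552691432903715,
        "acc_stderr": 0.008096605771155738
    },
    "harness|winogrande|5": {
        "acc": 0.6566692975532754,
        "acc_stderr": 0.013344823185358009
    }
}
```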
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
59955a7b0f3a978293aecd5398fe4eb07bb9c847
|
# Dataset Card for Evaluation run of OpenAssistant/pythia-12b-pre-v8-12.5k-steps
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/OpenAssistant/pythia-12b-pre-v8-12.5k-steps
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [OpenAssistant/pythia-12b-pre-v8-12.5k-steps](https://huggingface.co/OpenAssistant/pythia-12b-pre-v8-12.5k-steps) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_OpenAssistant__pythia-12b-pre-v8-12.5k-steps_public",
"harness_winogrande_5",
split="train")
```
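All available configurations (one per evaluated task, plus "results") are listed in this card's metadata; they can also be queried programmatically. A minimal sketch, assuming the `datasets` library's `get_dataset_config_names` helper:

```python
from datasets import get_dataset_config_names

# List every configuration of this dataset (one per task, plus "results")
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_OpenAssistant__pythia-12b-pre-v8-12.5k-steps_public"
)
print(configs)  # e.g. ['harness_drop_3', 'harness_gsm8k_5', 'harness_winogrande_5', 'results']
```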
## Latest results
These are the [latest results from run 2023-11-06T18:06:29.606076](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenAssistant__pythia-12b-pre-v8-12.5k-steps_public/blob/main/results_2023-11-06T18-06-29.606076.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0007340604026845638,
"em_stderr": 0.00027736144573356703,
"f1": 0.04888842281879203,
"f1_stderr": 0.001200028057735833,
"acc": 0.3646480645630345,
"acc_stderr": 0.010352737065601407
},
"harness|drop|3": {
"em": 0.0007340604026845638,
"em_stderr": 0.00027736144573356703,
"f1": 0.04888842281879203,
"f1_stderr": 0.001200028057735833
},
"harness|gsm8k|5": {
"acc": 0.07657316148597422,
"acc_stderr": 0.007324564881451574
},
"harness|winogrande|5": {
"acc": 0.6527229676400947,
"acc_stderr": 0.013380909249751239
}
}
```
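The aggregated metrics shown above live in the "results" configuration. A minimal sketch of loading them, assuming the same `load_dataset` access pattern as above (the timestamped split name comes from this card's metadata):

```python
from datasets import load_dataset

# Latest aggregated results across all evaluated tasks
latest_results = load_dataset(
    "open-llm-leaderboard/details_OpenAssistant__pythia-12b-pre-v8-12.5k-steps_public",
    "results",
    split="latest",
)

# An earlier run, addressed by its timestamped split name
earlier_results = load_dataset(
    "open-llm-leaderboard/details_OpenAssistant__pythia-12b-pre-v8-12.5k-steps_public",
    "results",
    split="2023_11_05T00_35_27.872152",
)
```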
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_OpenAssistant__pythia-12b-pre-v8-12.5k-steps
|
[
"region:us"
] |
2023-08-18T10:13:02+00:00
|
{"pretty_name": "Evaluation run of OpenAssistant/pythia-12b-pre-v8-12.5k-steps", "dataset_summary": "Dataset automatically created during the evaluation run of model [OpenAssistant/pythia-12b-pre-v8-12.5k-steps](https://huggingface.co/OpenAssistant/pythia-12b-pre-v8-12.5k-steps) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 3 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_OpenAssistant__pythia-12b-pre-v8-12.5k-steps_public\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-11-06T18:06:29.606076](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenAssistant__pythia-12b-pre-v8-12.5k-steps_public/blob/main/results_2023-11-06T18-06-29.606076.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0007340604026845638,\n \"em_stderr\": 0.00027736144573356703,\n \"f1\": 0.04888842281879203,\n \"f1_stderr\": 0.001200028057735833,\n \"acc\": 0.3646480645630345,\n \"acc_stderr\": 0.010352737065601407\n },\n \"harness|drop|3\": {\n \"em\": 0.0007340604026845638,\n \"em_stderr\": 0.00027736144573356703,\n \"f1\": 0.04888842281879203,\n \"f1_stderr\": 0.001200028057735833\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.07657316148597422,\n \"acc_stderr\": 0.007324564881451574\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6527229676400947,\n \"acc_stderr\": 0.013380909249751239\n }\n}\n```", "repo_url": "https://huggingface.co/OpenAssistant/pythia-12b-pre-v8-12.5k-steps", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_drop_3", "data_files": [{"split": "2023_11_05T00_35_27.872152", "path": ["**/details_harness|drop|3_2023-11-05T00-35-27.872152.parquet"]}, {"split": "2023_11_06T18_06_29.606076", "path": ["**/details_harness|drop|3_2023-11-06T18-06-29.606076.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-11-06T18-06-29.606076.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_11_05T00_35_27.872152", "path": ["**/details_harness|gsm8k|5_2023-11-05T00-35-27.872152.parquet"]}, {"split": "2023_11_06T18_06_29.606076", "path": ["**/details_harness|gsm8k|5_2023-11-06T18-06-29.606076.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-11-06T18-06-29.606076.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_11_05T00_35_27.872152", "path": ["**/details_harness|winogrande|5_2023-11-05T00-35-27.872152.parquet"]}, {"split": "2023_11_06T18_06_29.606076", "path": 
["**/details_harness|winogrande|5_2023-11-06T18-06-29.606076.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-11-06T18-06-29.606076.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_11_05T00_35_27.872152", "path": ["results_2023-11-05T00-35-27.872152.parquet"]}, {"split": "2023_11_06T18_06_29.606076", "path": ["results_2023-11-06T18-06-29.606076.parquet"]}, {"split": "latest", "path": ["results_2023-11-06T18-06-29.606076.parquet"]}]}]}
|
2023-12-01T14:11:05+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of OpenAssistant/pythia-12b-pre-v8-12.5k-steps
## Dataset Description
- Homepage:
- Repository: https://huggingface.co/OpenAssistant/pythia-12b-pre-v8-12.5k-steps
- Paper:
- Leaderboard: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- Point of Contact: [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model OpenAssistant/pythia-12b-pre-v8-12.5k-steps on the Open LLM Leaderboard.
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-11-06T18:06:29.606076 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
ec5583dbcd5de0046b54de14d2229c7adaba8f1d
|
# Dataset Card for Evaluation run of andreaskoepf/llama2-13b-megacode2_min100
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/andreaskoepf/llama2-13b-megacode2_min100
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [andreaskoepf/llama2-13b-megacode2_min100](https://huggingface.co/andreaskoepf/llama2-13b-megacode2_min100) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_andreaskoepf__llama2-13b-megacode2_min100",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-22T15:48:34.680007](https://huggingface.co/datasets/open-llm-leaderboard/details_andreaskoepf__llama2-13b-megacode2_min100/blob/main/results_2023-09-22T15-48-34.680007.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0030411073825503355,
"em_stderr": 0.0005638896908753115,
"f1": 0.07890205536912773,
"f1_stderr": 0.0016368809848969982,
"acc": 0.4643729284759866,
"acc_stderr": 0.010956919441194278
},
"harness|drop|3": {
"em": 0.0030411073825503355,
"em_stderr": 0.0005638896908753115,
"f1": 0.07890205536912773,
"f1_stderr": 0.0016368809848969982
},
"harness|gsm8k|5": {
"acc": 0.15921152388172857,
"acc_stderr": 0.010077966717551878
},
"harness|winogrande|5": {
"acc": 0.7695343330702447,
"acc_stderr": 0.01183587216483668
}
}
```
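Beyond these aggregated numbers, each per-task configuration holds the per-sample details of the run. A minimal sketch, assuming the same access pattern as above, for the 3-shot DROP details:

```python
from datasets import load_dataset

# Per-sample details for the 3-shot DROP evaluation of this run
drop_details = load_dataset(
    "open-llm-leaderboard/details_andreaskoepf__llama2-13b-megacode2_min100",
    "harness_drop_3",
    split="latest",
)
```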
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_andreaskoepf__llama2-13b-megacode2_min100
|
[
"region:us"
] |
2023-08-18T10:13:10+00:00
|
{"pretty_name": "Evaluation run of andreaskoepf/llama2-13b-megacode2_min100", "dataset_summary": "Dataset automatically created during the evaluation run of model [andreaskoepf/llama2-13b-megacode2_min100](https://huggingface.co/andreaskoepf/llama2-13b-megacode2_min100) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_andreaskoepf__llama2-13b-megacode2_min100\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-22T15:48:34.680007](https://huggingface.co/datasets/open-llm-leaderboard/details_andreaskoepf__llama2-13b-megacode2_min100/blob/main/results_2023-09-22T15-48-34.680007.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0030411073825503355,\n \"em_stderr\": 0.0005638896908753115,\n \"f1\": 0.07890205536912773,\n \"f1_stderr\": 0.0016368809848969982,\n \"acc\": 0.4643729284759866,\n \"acc_stderr\": 0.010956919441194278\n },\n \"harness|drop|3\": {\n \"em\": 0.0030411073825503355,\n \"em_stderr\": 0.0005638896908753115,\n \"f1\": 0.07890205536912773,\n \"f1_stderr\": 0.0016368809848969982\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.15921152388172857,\n \"acc_stderr\": 0.010077966717551878\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7695343330702447,\n \"acc_stderr\": 0.01183587216483668\n }\n}\n```", "repo_url": "https://huggingface.co/andreaskoepf/llama2-13b-megacode2_min100", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_17T10_46_30.131407", "path": ["**/details_harness|arc:challenge|25_2023-08-17T10:46:30.131407.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-17T10:46:30.131407.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_22T15_48_34.680007", "path": ["**/details_harness|drop|3_2023-09-22T15-48-34.680007.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-22T15-48-34.680007.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_22T15_48_34.680007", "path": ["**/details_harness|gsm8k|5_2023-09-22T15-48-34.680007.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-22T15-48-34.680007.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_17T10_46_30.131407", "path": ["**/details_harness|hellaswag|10_2023-08-17T10:46:30.131407.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hellaswag|10_2023-08-17T10:46:30.131407.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_17T10_46_30.131407", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T10:46:30.131407.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-17T10:46:30.131407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-17T10:46:30.131407.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T10:46:30.131407.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-17T10:46:30.131407.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-08-17T10:46:30.131407.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_17T10_46_30.131407", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T10:46:30.131407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T10:46:30.131407.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_17T10_46_30.131407", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-17T10:46:30.131407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-17T10:46:30.131407.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_17T10_46_30.131407", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-17T10:46:30.131407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-17T10:46:30.131407.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_17T10_46_30.131407", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T10:46:30.131407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T10:46:30.131407.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_17T10_46_30.131407", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T10:46:30.131407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T10:46:30.131407.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_17T10_46_30.131407", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-17T10:46:30.131407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-17T10:46:30.131407.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_17T10_46_30.131407", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T10:46:30.131407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T10:46:30.131407.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_17T10_46_30.131407", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T10:46:30.131407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T10:46:30.131407.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_17T10_46_30.131407", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T10:46:30.131407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T10:46:30.131407.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_17T10_46_30.131407", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T10:46:30.131407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T10:46:30.131407.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_08_17T10_46_30.131407", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-17T10:46:30.131407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-17T10:46:30.131407.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_17T10_46_30.131407", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-17T10:46:30.131407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-17T10:46:30.131407.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_17T10_46_30.131407", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T10:46:30.131407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T10:46:30.131407.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_17T10_46_30.131407", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-17T10:46:30.131407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-17T10:46:30.131407.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_17T10_46_30.131407", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T10:46:30.131407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T10:46:30.131407.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_17T10_46_30.131407", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T10:46:30.131407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T10:46:30.131407.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_17T10_46_30.131407", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T10:46:30.131407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T10:46:30.131407.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_17T10_46_30.131407", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-17T10:46:30.131407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-17T10:46:30.131407.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_17T10_46_30.131407", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T10:46:30.131407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T10:46:30.131407.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_17T10_46_30.131407", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T10:46:30.131407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T10:46:30.131407.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_17T10_46_30.131407", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T10:46:30.131407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T10:46:30.131407.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_17T10_46_30.131407", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T10:46:30.131407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T10:46:30.131407.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_17T10_46_30.131407", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T10:46:30.131407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T10:46:30.131407.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_17T10_46_30.131407", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T10:46:30.131407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T10:46:30.131407.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_17T10_46_30.131407", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T10:46:30.131407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T10:46:30.131407.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_17T10_46_30.131407", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T10:46:30.131407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T10:46:30.131407.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_17T10_46_30.131407", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T10:46:30.131407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T10:46:30.131407.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_17T10_46_30.131407", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T10:46:30.131407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T10:46:30.131407.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_17T10_46_30.131407", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T10:46:30.131407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T10:46:30.131407.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_17T10_46_30.131407", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T10:46:30.131407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T10:46:30.131407.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_17T10_46_30.131407", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T10:46:30.131407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T10:46:30.131407.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_17T10_46_30.131407", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T10:46:30.131407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T10:46:30.131407.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_17T10_46_30.131407", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-17T10:46:30.131407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-17T10:46:30.131407.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_17T10_46_30.131407", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T10:46:30.131407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T10:46:30.131407.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_17T10_46_30.131407", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-17T10:46:30.131407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-17T10:46:30.131407.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_17T10_46_30.131407", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T10:46:30.131407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T10:46:30.131407.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_17T10_46_30.131407", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T10:46:30.131407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T10:46:30.131407.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_17T10_46_30.131407", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T10:46:30.131407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T10:46:30.131407.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_17T10_46_30.131407", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-17T10:46:30.131407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-17T10:46:30.131407.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_17T10_46_30.131407", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-17T10:46:30.131407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-17T10:46:30.131407.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_17T10_46_30.131407", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T10:46:30.131407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T10:46:30.131407.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_17T10_46_30.131407", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T10:46:30.131407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T10:46:30.131407.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_17T10_46_30.131407", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T10:46:30.131407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T10:46:30.131407.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_17T10_46_30.131407", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T10:46:30.131407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T10:46:30.131407.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_17T10_46_30.131407", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-17T10:46:30.131407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-17T10:46:30.131407.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_17T10_46_30.131407", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-17T10:46:30.131407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-17T10:46:30.131407.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_17T10_46_30.131407", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-17T10:46:30.131407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-17T10:46:30.131407.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_17T10_46_30.131407", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T10:46:30.131407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T10:46:30.131407.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_17T10_46_30.131407", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-17T10:46:30.131407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-17T10:46:30.131407.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_17T10_46_30.131407", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T10:46:30.131407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T10:46:30.131407.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_17T10_46_30.131407", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T10:46:30.131407.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T10:46:30.131407.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_17T10_46_30.131407", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-17T10:46:30.131407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-17T10:46:30.131407.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_17T10_46_30.131407", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-17T10:46:30.131407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-17T10:46:30.131407.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_17T10_46_30.131407", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-17T10:46:30.131407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-17T10:46:30.131407.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_17T10_46_30.131407", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T10:46:30.131407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T10:46:30.131407.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_17T10_46_30.131407", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-17T10:46:30.131407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-17T10:46:30.131407.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_17T10_46_30.131407", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-17T10:46:30.131407.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-17T10:46:30.131407.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_17T10_46_30.131407", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-17T10:46:30.131407.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-17T10:46:30.131407.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_22T15_48_34.680007", "path": ["**/details_harness|winogrande|5_2023-09-22T15-48-34.680007.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-22T15-48-34.680007.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_17T10_46_30.131407", "path": ["results_2023-08-17T10:46:30.131407.parquet"]}, {"split": "2023_09_22T15_48_34.680007", "path": ["results_2023-09-22T15-48-34.680007.parquet"]}, {"split": "latest", "path": ["results_2023-09-22T15-48-34.680007.parquet"]}]}]}
|
2023-09-22T14:48:47+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of andreaskoepf/llama2-13b-megacode2_min100
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model andreaskoepf/llama2-13b-megacode2_min100 on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
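```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_andreaskoepf__llama2-13b-megacode2_min100",
	"harness_winogrande_5",
	split="train")
```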
## Latest results
These are the latest results from run 2023-09-22T15:48:34.680007 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
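```python
{
    "all": {
        "em": 0.0030411073825503355,
        "em_stderr": 0.0005638896908753115,
        "f1": 0.07890205536912773,
        "f1_stderr": 0.0016368809848969982,
        "acc": 0.4643729284759866,
        "acc_stderr": 0.010956919441194278
    },
    "harness|drop|3": {
        "em": 0.0030411073825503355,
        "em_stderr": 0.0005638896908753115,
        "f1": 0.07890205536912773,
        "f1_stderr": 0.0016368809848969982
    },
    "harness|gsm8k|5": {
        "acc": 0.15921152388172857,
        "acc_stderr": 0.010077966717551878
    },
    "harness|winogrande|5": {
        "acc": 0.7695343330702447,
        "acc_stderr": 0.01183587216483668
    }
}
```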
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of andreaskoepf/llama2-13b-megacode2_min100",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model andreaskoepf/llama2-13b-megacode2_min100 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-22T15:48:34.680007(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of andreaskoepf/llama2-13b-megacode2_min100",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model andreaskoepf/llama2-13b-megacode2_min100 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-22T15:48:34.680007(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
27,
31,
175,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of andreaskoepf/llama2-13b-megacode2_min100## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model andreaskoepf/llama2-13b-megacode2_min100 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-22T15:48:34.680007(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
756166d5d41491fe7edd17a5b67da495df7b07a5
|
# Dataset Card for Evaluation run of fireballoon/baichuan-vicuna-chinese-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/fireballoon/baichuan-vicuna-chinese-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [fireballoon/baichuan-vicuna-chinese-7b](https://huggingface.co/fireballoon/baichuan-vicuna-chinese-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_fireballoon__baichuan-vicuna-chinese-7b",
"harness_winogrande_5",
split="train")
```
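You can also enumerate the available configurations, or load the aggregated "results" configuration directly. A minimal sketch (the configuration and split names below come from this dataset's metadata; `get_dataset_config_names` is the standard helper from the `datasets` library):

```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_fireballoon__baichuan-vicuna-chinese-7b"

# List all 64 evaluation configurations stored in this repository.
print(get_dataset_config_names(repo))

# The "results" configuration aggregates every run; the "latest" split
# always points at the most recent evaluation.
results = load_dataset(repo, "results", split="latest")
print(results[0])
```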
## Latest results
These are the [latest results from run 2023-09-17T14:20:11.480532](https://huggingface.co/datasets/open-llm-leaderboard/details_fireballoon__baichuan-vicuna-chinese-7b/blob/main/results_2023-09-17T14-20-11.480532.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.22242030201342283,
"em_stderr": 0.00425891841660003,
"f1": 0.2740048238255038,
"f1_stderr": 0.004278992735739422,
"acc": 0.3619266227972807,
"acc_stderr": 0.00976430949757211
},
"harness|drop|3": {
"em": 0.22242030201342283,
"em_stderr": 0.00425891841660003,
"f1": 0.2740048238255038,
"f1_stderr": 0.004278992735739422
},
"harness|gsm8k|5": {
"acc": 0.05534495830174375,
"acc_stderr": 0.006298221796179566
},
"harness|winogrande|5": {
"acc": 0.6685082872928176,
"acc_stderr": 0.013230397198964655
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_fireballoon__baichuan-vicuna-chinese-7b
|
[
"region:us"
] |
2023-08-18T10:13:19+00:00
|
{"pretty_name": "Evaluation run of fireballoon/baichuan-vicuna-chinese-7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [fireballoon/baichuan-vicuna-chinese-7b](https://huggingface.co/fireballoon/baichuan-vicuna-chinese-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_fireballoon__baichuan-vicuna-chinese-7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-17T14:20:11.480532](https://huggingface.co/datasets/open-llm-leaderboard/details_fireballoon__baichuan-vicuna-chinese-7b/blob/main/results_2023-09-17T14-20-11.480532.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.22242030201342283,\n \"em_stderr\": 0.00425891841660003,\n \"f1\": 0.2740048238255038,\n \"f1_stderr\": 0.004278992735739422,\n \"acc\": 0.3619266227972807,\n \"acc_stderr\": 0.00976430949757211\n },\n \"harness|drop|3\": {\n \"em\": 0.22242030201342283,\n \"em_stderr\": 0.00425891841660003,\n \"f1\": 0.2740048238255038,\n \"f1_stderr\": 0.004278992735739422\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.05534495830174375,\n \"acc_stderr\": 0.006298221796179566\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6685082872928176,\n \"acc_stderr\": 0.013230397198964655\n }\n}\n```", "repo_url": "https://huggingface.co/fireballoon/baichuan-vicuna-chinese-7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_10T10_02_03.270696", "path": ["**/details_harness|arc:challenge|25_2023-08-10T10:02:03.270696.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-10T10:02:03.270696.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_17T14_20_11.480532", "path": ["**/details_harness|drop|3_2023-09-17T14-20-11.480532.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-17T14-20-11.480532.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_17T14_20_11.480532", "path": ["**/details_harness|gsm8k|5_2023-09-17T14-20-11.480532.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-17T14-20-11.480532.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_10T10_02_03.270696", "path": ["**/details_harness|hellaswag|10_2023-08-10T10:02:03.270696.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hellaswag|10_2023-08-10T10:02:03.270696.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_10T10_02_03.270696", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-10T10:02:03.270696.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-10T10:02:03.270696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-10T10:02:03.270696.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-10T10:02:03.270696.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-10T10:02:03.270696.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-08-10T10:02:03.270696.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_10T10_02_03.270696", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-10T10:02:03.270696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-10T10:02:03.270696.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_10T10_02_03.270696", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-10T10:02:03.270696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-10T10:02:03.270696.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_10T10_02_03.270696", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-10T10:02:03.270696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-10T10:02:03.270696.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_10T10_02_03.270696", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-10T10:02:03.270696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-10T10:02:03.270696.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_10T10_02_03.270696", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-10T10:02:03.270696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-10T10:02:03.270696.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_10T10_02_03.270696", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-10T10:02:03.270696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-10T10:02:03.270696.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_10T10_02_03.270696", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-10T10:02:03.270696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-10T10:02:03.270696.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_10T10_02_03.270696", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-10T10:02:03.270696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-10T10:02:03.270696.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_10T10_02_03.270696", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-10T10:02:03.270696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-10T10:02:03.270696.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_10T10_02_03.270696", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-10T10:02:03.270696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-10T10:02:03.270696.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_08_10T10_02_03.270696", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-10T10:02:03.270696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-10T10:02:03.270696.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_10T10_02_03.270696", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-10T10:02:03.270696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-10T10:02:03.270696.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_10T10_02_03.270696", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-10T10:02:03.270696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-10T10:02:03.270696.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_10T10_02_03.270696", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-10T10:02:03.270696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-10T10:02:03.270696.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_10T10_02_03.270696", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-10T10:02:03.270696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-10T10:02:03.270696.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_10T10_02_03.270696", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-10T10:02:03.270696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-10T10:02:03.270696.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_10T10_02_03.270696", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-10T10:02:03.270696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-10T10:02:03.270696.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_10T10_02_03.270696", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-10T10:02:03.270696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-10T10:02:03.270696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_10T10_02_03.270696", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-10T10:02:03.270696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-10T10:02:03.270696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_10T10_02_03.270696", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-10T10:02:03.270696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-10T10:02:03.270696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_10T10_02_03.270696", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-10T10:02:03.270696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-10T10:02:03.270696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_10T10_02_03.270696", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-10T10:02:03.270696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-10T10:02:03.270696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_10T10_02_03.270696", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-10T10:02:03.270696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-10T10:02:03.270696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_10T10_02_03.270696", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-10T10:02:03.270696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-10T10:02:03.270696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_10T10_02_03.270696", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-10T10:02:03.270696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-10T10:02:03.270696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_10T10_02_03.270696", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-10T10:02:03.270696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-10T10:02:03.270696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_10T10_02_03.270696", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-10T10:02:03.270696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-10T10:02:03.270696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_10T10_02_03.270696", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-10T10:02:03.270696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-10T10:02:03.270696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_10T10_02_03.270696", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-10T10:02:03.270696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-10T10:02:03.270696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_10T10_02_03.270696", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-10T10:02:03.270696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-10T10:02:03.270696.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_10T10_02_03.270696", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-10T10:02:03.270696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-10T10:02:03.270696.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_10T10_02_03.270696", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-10T10:02:03.270696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-10T10:02:03.270696.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_10T10_02_03.270696", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-10T10:02:03.270696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-10T10:02:03.270696.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_10T10_02_03.270696", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-10T10:02:03.270696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-10T10:02:03.270696.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_10T10_02_03.270696", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-10T10:02:03.270696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-10T10:02:03.270696.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_10T10_02_03.270696", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-10T10:02:03.270696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-10T10:02:03.270696.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_10T10_02_03.270696", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-10T10:02:03.270696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-10T10:02:03.270696.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_10T10_02_03.270696", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-10T10:02:03.270696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-10T10:02:03.270696.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_10T10_02_03.270696", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-10T10:02:03.270696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-10T10:02:03.270696.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_10T10_02_03.270696", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-10T10:02:03.270696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-10T10:02:03.270696.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_10T10_02_03.270696", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-10T10:02:03.270696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-10T10:02:03.270696.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_10T10_02_03.270696", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-10T10:02:03.270696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-10T10:02:03.270696.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_10T10_02_03.270696", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-10T10:02:03.270696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-10T10:02:03.270696.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_10T10_02_03.270696", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-10T10:02:03.270696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-10T10:02:03.270696.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_10T10_02_03.270696", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-10T10:02:03.270696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-10T10:02:03.270696.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_10T10_02_03.270696", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-10T10:02:03.270696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-10T10:02:03.270696.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_10T10_02_03.270696", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-10T10:02:03.270696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-10T10:02:03.270696.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_10T10_02_03.270696", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-10T10:02:03.270696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-10T10:02:03.270696.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_10T10_02_03.270696", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-10T10:02:03.270696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-10T10:02:03.270696.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_10T10_02_03.270696", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-10T10:02:03.270696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-10T10:02:03.270696.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_10T10_02_03.270696", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-10T10:02:03.270696.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-10T10:02:03.270696.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_10T10_02_03.270696", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-10T10:02:03.270696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-10T10:02:03.270696.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_10T10_02_03.270696", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-10T10:02:03.270696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-10T10:02:03.270696.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_10T10_02_03.270696", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-10T10:02:03.270696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-10T10:02:03.270696.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_10T10_02_03.270696", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-10T10:02:03.270696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-10T10:02:03.270696.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_10T10_02_03.270696", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-10T10:02:03.270696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-10T10:02:03.270696.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_10T10_02_03.270696", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-10T10:02:03.270696.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-10T10:02:03.270696.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_10T10_02_03.270696", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-10T10:02:03.270696.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-10T10:02:03.270696.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_17T14_20_11.480532", "path": ["**/details_harness|winogrande|5_2023-09-17T14-20-11.480532.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-17T14-20-11.480532.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_10T10_02_03.270696", "path": ["results_2023-08-10T10:02:03.270696.parquet"]}, {"split": "2023_09_17T14_20_11.480532", "path": ["results_2023-09-17T14-20-11.480532.parquet"]}, {"split": "latest", "path": ["results_2023-09-17T14-20-11.480532.parquet"]}]}]}
|
2023-09-17T13:20:23+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of fireballoon/baichuan-vicuna-chinese-7b
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model fireballoon/baichuan-vicuna-chinese-7b on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
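The code snippet that originally accompanied this card was stripped during text extraction; below is a minimal sketch following the loader pattern used by the other cards in this dump. The repository id is inferred from the card title and the config name from this card's metadata, so treat both as assumptions:
```python
from datasets import load_dataset

# Repository id inferred from the card title (assumption); the config name
# follows the "harness_winogrande_5" convention declared in this card's metadata.
data = load_dataset("open-llm-leaderboard/details_fireballoon__baichuan-vicuna-chinese-7b",
	"harness_winogrande_5",
	split="train")
```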
## Latest results
These are the latest results from run 2023-09-17T14:20:11.480532 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
b439fd014f892fe2efda71b4cff858446f20da6a
|
# Dataset Card for Evaluation run of TheTravellingEngineer/llama2-7b-chat-hf-v4
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheTravellingEngineer/llama2-7b-chat-hf-v4
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheTravellingEngineer/llama2-7b-chat-hf-v4](https://huggingface.co/TheTravellingEngineer/llama2-7b-chat-hf-v4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheTravellingEngineer__llama2-7b-chat-hf-v4",
"harness_winogrande_5",
split="train")
```
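With 64 configurations available, it can help to enumerate them programmatically before picking one. A short sketch using the `datasets` helper (the printed names are illustrative):
```python
from datasets import get_dataset_config_names

# Lists the config names, e.g. 'harness_arc_challenge_25', 'harness_winogrande_5', ...
configs = get_dataset_config_names("open-llm-leaderboard/details_TheTravellingEngineer__llama2-7b-chat-hf-v4")
print(len(configs), configs[:3])
```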
## Latest results
These are the [latest results from run 2023-10-21T19:45:01.546933](https://huggingface.co/datasets/open-llm-leaderboard/details_TheTravellingEngineer__llama2-7b-chat-hf-v4/blob/main/results_2023-10-21T19-45-01.546933.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.001153523489932886,
"em_stderr": 0.00034761798968571027,
"f1": 0.05605494966442959,
"f1_stderr": 0.0013169501309663063,
"acc": 0.4076941764856182,
"acc_stderr": 0.009790166925519655
},
"harness|drop|3": {
"em": 0.001153523489932886,
"em_stderr": 0.00034761798968571027,
"f1": 0.05605494966442959,
"f1_stderr": 0.0013169501309663063
},
"harness|gsm8k|5": {
"acc": 0.07505686125852919,
"acc_stderr": 0.007257633145486643
},
"harness|winogrande|5": {
"acc": 0.7403314917127072,
"acc_stderr": 0.012322700705552667
}
}
```
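The aggregated numbers above live in the "results" configuration, whose "latest" split always points at the most recent run, so they can be reloaded directly (a sketch; the config and split names are the ones declared in this card's metadata):
```python
from datasets import load_dataset

# The "results" config and its "latest" split are declared in this card's metadata.
results = load_dataset("open-llm-leaderboard/details_TheTravellingEngineer__llama2-7b-chat-hf-v4",
	"results",
	split="latest")
```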
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_TheTravellingEngineer__llama2-7b-chat-hf-v4
|
[
"region:us"
] |
2023-08-18T10:13:27+00:00
|
{"pretty_name": "Evaluation run of TheTravellingEngineer/llama2-7b-chat-hf-v4", "dataset_summary": "Dataset automatically created during the evaluation run of model [TheTravellingEngineer/llama2-7b-chat-hf-v4](https://huggingface.co/TheTravellingEngineer/llama2-7b-chat-hf-v4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheTravellingEngineer__llama2-7b-chat-hf-v4\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-21T19:45:01.546933](https://huggingface.co/datasets/open-llm-leaderboard/details_TheTravellingEngineer__llama2-7b-chat-hf-v4/blob/main/results_2023-10-21T19-45-01.546933.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.001153523489932886,\n \"em_stderr\": 0.00034761798968571027,\n \"f1\": 0.05605494966442959,\n \"f1_stderr\": 0.0013169501309663063,\n \"acc\": 0.4076941764856182,\n \"acc_stderr\": 0.009790166925519655\n },\n \"harness|drop|3\": {\n \"em\": 0.001153523489932886,\n \"em_stderr\": 0.00034761798968571027,\n \"f1\": 0.05605494966442959,\n \"f1_stderr\": 0.0013169501309663063\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.07505686125852919,\n \"acc_stderr\": 0.007257633145486643\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7403314917127072,\n \"acc_stderr\": 0.012322700705552667\n }\n}\n```", "repo_url": "https://huggingface.co/TheTravellingEngineer/llama2-7b-chat-hf-v4", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_16T13_46_44.811067", "path": ["**/details_harness|arc:challenge|25_2023-08-16T13:46:44.811067.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-16T13:46:44.811067.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_21T17_33_49.003141", "path": ["**/details_harness|drop|3_2023-10-21T17-33-49.003141.parquet"]}, {"split": "2023_10_21T19_45_01.546933", "path": ["**/details_harness|drop|3_2023-10-21T19-45-01.546933.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-21T19-45-01.546933.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_21T17_33_49.003141", "path": ["**/details_harness|gsm8k|5_2023-10-21T17-33-49.003141.parquet"]}, {"split": "2023_10_21T19_45_01.546933", "path": ["**/details_harness|gsm8k|5_2023-10-21T19-45-01.546933.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-21T19-45-01.546933.parquet"]}]}, 
{"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_16T13_46_44.811067", "path": ["**/details_harness|hellaswag|10_2023-08-16T13:46:44.811067.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-16T13:46:44.811067.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_16T13_46_44.811067", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-16T13:46:44.811067.parquet", 
"**/details_harness|hendrycksTest-international_law|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-16T13:46:44.811067.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-16T13:46:44.811067.parquet", 
"**/details_harness|hendrycksTest-formal_logic|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-16T13:46:44.811067.parquet", 
"**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-16T13:46:44.811067.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-16T13:46:44.811067.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_16T13_46_44.811067", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-16T13:46:44.811067.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-16T13:46:44.811067.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_16T13_46_44.811067", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-16T13:46:44.811067.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-16T13:46:44.811067.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_16T13_46_44.811067", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-16T13:46:44.811067.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-16T13:46:44.811067.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_16T13_46_44.811067", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-16T13:46:44.811067.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-16T13:46:44.811067.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_16T13_46_44.811067", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-16T13:46:44.811067.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-16T13:46:44.811067.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_16T13_46_44.811067", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-16T13:46:44.811067.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-16T13:46:44.811067.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_16T13_46_44.811067", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-16T13:46:44.811067.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-16T13:46:44.811067.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_16T13_46_44.811067", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-16T13:46:44.811067.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-16T13:46:44.811067.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_16T13_46_44.811067", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-16T13:46:44.811067.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-16T13:46:44.811067.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_16T13_46_44.811067", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-16T13:46:44.811067.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_medicine|5_2023-08-16T13:46:44.811067.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_16T13_46_44.811067", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-16T13:46:44.811067.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-16T13:46:44.811067.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_16T13_46_44.811067", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-16T13:46:44.811067.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-16T13:46:44.811067.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_16T13_46_44.811067", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-16T13:46:44.811067.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-16T13:46:44.811067.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_16T13_46_44.811067", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-16T13:46:44.811067.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-16T13:46:44.811067.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_16T13_46_44.811067", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-16T13:46:44.811067.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-16T13:46:44.811067.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_16T13_46_44.811067", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-16T13:46:44.811067.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-16T13:46:44.811067.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_16T13_46_44.811067", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-16T13:46:44.811067.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-16T13:46:44.811067.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_16T13_46_44.811067", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-16T13:46:44.811067.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-16T13:46:44.811067.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_16T13_46_44.811067", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-16T13:46:44.811067.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-16T13:46:44.811067.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_16T13_46_44.811067", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-16T13:46:44.811067.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-16T13:46:44.811067.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_16T13_46_44.811067", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-16T13:46:44.811067.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-16T13:46:44.811067.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_16T13_46_44.811067", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-16T13:46:44.811067.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-16T13:46:44.811067.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_16T13_46_44.811067", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-16T13:46:44.811067.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-16T13:46:44.811067.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_16T13_46_44.811067", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-16T13:46:44.811067.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-16T13:46:44.811067.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_16T13_46_44.811067", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-16T13:46:44.811067.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-16T13:46:44.811067.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_16T13_46_44.811067", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-16T13:46:44.811067.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-16T13:46:44.811067.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_16T13_46_44.811067", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-16T13:46:44.811067.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-16T13:46:44.811067.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_16T13_46_44.811067", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-16T13:46:44.811067.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-16T13:46:44.811067.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_16T13_46_44.811067", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-16T13:46:44.811067.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-16T13:46:44.811067.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_16T13_46_44.811067", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-16T13:46:44.811067.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-16T13:46:44.811067.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_16T13_46_44.811067", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-16T13:46:44.811067.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-16T13:46:44.811067.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_16T13_46_44.811067", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-16T13:46:44.811067.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-16T13:46:44.811067.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_16T13_46_44.811067", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-16T13:46:44.811067.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-16T13:46:44.811067.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_16T13_46_44.811067", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-16T13:46:44.811067.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-16T13:46:44.811067.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_16T13_46_44.811067", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-16T13:46:44.811067.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-16T13:46:44.811067.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_16T13_46_44.811067", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-16T13:46:44.811067.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-16T13:46:44.811067.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_16T13_46_44.811067", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-16T13:46:44.811067.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-16T13:46:44.811067.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_16T13_46_44.811067", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-16T13:46:44.811067.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-16T13:46:44.811067.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_16T13_46_44.811067", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-16T13:46:44.811067.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-16T13:46:44.811067.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_16T13_46_44.811067", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-16T13:46:44.811067.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-16T13:46:44.811067.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": 
[{"split": "2023_08_16T13_46_44.811067", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-16T13:46:44.811067.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-16T13:46:44.811067.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_16T13_46_44.811067", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-16T13:46:44.811067.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-16T13:46:44.811067.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_16T13_46_44.811067", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-16T13:46:44.811067.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-16T13:46:44.811067.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_16T13_46_44.811067", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-16T13:46:44.811067.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-16T13:46:44.811067.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_16T13_46_44.811067", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-16T13:46:44.811067.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-16T13:46:44.811067.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_16T13_46_44.811067", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-16T13:46:44.811067.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-16T13:46:44.811067.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_16T13_46_44.811067", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-16T13:46:44.811067.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-16T13:46:44.811067.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_16T13_46_44.811067", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-16T13:46:44.811067.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-16T13:46:44.811067.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_16T13_46_44.811067", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-16T13:46:44.811067.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-16T13:46:44.811067.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_16T13_46_44.811067", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-16T13:46:44.811067.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-16T13:46:44.811067.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_16T13_46_44.811067", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-16T13:46:44.811067.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-16T13:46:44.811067.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_16T13_46_44.811067", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-16T13:46:44.811067.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-16T13:46:44.811067.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_16T13_46_44.811067", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-16T13:46:44.811067.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-16T13:46:44.811067.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_16T13_46_44.811067", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-16T13:46:44.811067.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-16T13:46:44.811067.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_16T13_46_44.811067", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-16T13:46:44.811067.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-16T13:46:44.811067.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_16T13_46_44.811067", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-16T13:46:44.811067.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-16T13:46:44.811067.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_16T13_46_44.811067", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-16T13:46:44.811067.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-16T13:46:44.811067.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_16T13_46_44.811067", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-16T13:46:44.811067.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-16T13:46:44.811067.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_21T17_33_49.003141", "path": ["**/details_harness|winogrande|5_2023-10-21T17-33-49.003141.parquet"]}, {"split": "2023_10_21T19_45_01.546933", "path": ["**/details_harness|winogrande|5_2023-10-21T19-45-01.546933.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-21T19-45-01.546933.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_16T13_46_44.811067", "path": ["results_2023-08-16T13:46:44.811067.parquet"]}, {"split": "2023_10_21T17_33_49.003141", "path": ["results_2023-10-21T17-33-49.003141.parquet"]}, {"split": "2023_10_21T19_45_01.546933", "path": ["results_2023-10-21T19-45-01.546933.parquet"]}, {"split": "latest", "path": ["results_2023-10-21T19-45-01.546933.parquet"]}]}]}
# Dataset Card for Evaluation run of TheTravellingEngineer/bloom-560m-RLHF-v2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheTravellingEngineer/bloom-560m-RLHF-v2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheTravellingEngineer/bloom-560m-RLHF-v2](https://huggingface.co/TheTravellingEngineer/bloom-560m-RLHF-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheTravellingEngineer__bloom-560m-RLHF-v2",
"harness_winogrande_5",
split="train")
```
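The aggregated metrics shown under "Latest results" below live in the "results" configuration described above. As a minimal sketch (using the "results" config and "latest" split names listed in this dataset's configs; the row layout is an assumption), you could read them back like this:

```python
from datasets import load_dataset

# The "results" config aggregates all runs; the "latest" split points at the newest one.
results = load_dataset(
    "open-llm-leaderboard/details_TheTravellingEngineer__bloom-560m-RLHF-v2",
    "results",
    split="latest",
)
print(results[0])  # assumed: a single row holding the latest run's aggregated metrics
```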
## Latest results
These are the [latest results from run 2023-10-21T18:07:38.079229](https://huggingface.co/datasets/open-llm-leaderboard/details_TheTravellingEngineer__bloom-560m-RLHF-v2/blob/main/results_2023-10-21T18-07-38.079229.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0016778523489932886,
"em_stderr": 0.00041913301788268527,
"f1": 0.03876782718120811,
"f1_stderr": 0.00113779684793395,
"acc": 0.2549173544570191,
"acc_stderr": 0.007404160104110119
},
"harness|drop|3": {
"em": 0.0016778523489932886,
"em_stderr": 0.00041913301788268527,
"f1": 0.03876782718120811,
"f1_stderr": 0.00113779684793395
},
"harness|gsm8k|5": {
"acc": 0.000758150113722517,
"acc_stderr": 0.0007581501137225266
},
"harness|winogrande|5": {
"acc": 0.5090765588003157,
"acc_stderr": 0.01405017009449771
}
}
```
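As a quick sanity check on the numbers above, the top-level "all" accuracy is simply the unweighted mean of the per-task accuracies (gsm8k and winogrande here):

```python
# Unweighted mean of the per-task accuracies reproduces the "all" accuracy above.
gsm8k_acc = 0.000758150113722517
winogrande_acc = 0.5090765588003157
print((gsm8k_acc + winogrande_acc) / 2)  # 0.2549173544570191, matching "acc" under "all"
```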
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
{"pretty_name": "Evaluation run of TheTravellingEngineer/bloom-560m-RLHF-v2", "dataset_summary": "Dataset automatically created during the evaluation run of model [TheTravellingEngineer/bloom-560m-RLHF-v2](https://huggingface.co/TheTravellingEngineer/bloom-560m-RLHF-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheTravellingEngineer__bloom-560m-RLHF-v2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-21T18:07:38.079229](https://huggingface.co/datasets/open-llm-leaderboard/details_TheTravellingEngineer__bloom-560m-RLHF-v2/blob/main/results_2023-10-21T18-07-38.079229.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0016778523489932886,\n \"em_stderr\": 0.00041913301788268527,\n \"f1\": 0.03876782718120811,\n \"f1_stderr\": 0.00113779684793395,\n \"acc\": 0.2549173544570191,\n \"acc_stderr\": 0.007404160104110119\n },\n \"harness|drop|3\": {\n \"em\": 0.0016778523489932886,\n \"em_stderr\": 0.00041913301788268527,\n \"f1\": 0.03876782718120811,\n \"f1_stderr\": 0.00113779684793395\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.000758150113722517,\n \"acc_stderr\": 0.0007581501137225266\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5090765588003157,\n \"acc_stderr\": 0.01405017009449771\n }\n}\n```", "repo_url": "https://huggingface.co/TheTravellingEngineer/bloom-560m-RLHF-v2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_09T14_22_38.044198", "path": ["**/details_harness|arc:challenge|25_2023-08-09T14:22:38.044198.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-09T14:22:38.044198.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_21T18_07_38.079229", "path": ["**/details_harness|drop|3_2023-10-21T18-07-38.079229.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-21T18-07-38.079229.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_21T18_07_38.079229", "path": ["**/details_harness|gsm8k|5_2023-10-21T18-07-38.079229.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-21T18-07-38.079229.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_09T14_22_38.044198", "path": ["**/details_harness|hellaswag|10_2023-08-09T14:22:38.044198.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hellaswag|10_2023-08-09T14:22:38.044198.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_09T14_22_38.044198", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T14:22:38.044198.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T14:22:38.044198.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T14:22:38.044198.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T14:22:38.044198.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T14:22:38.044198.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-08-09T14:22:38.044198.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_09T14_22_38.044198", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T14:22:38.044198.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T14:22:38.044198.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_09T14_22_38.044198", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T14:22:38.044198.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T14:22:38.044198.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_09T14_22_38.044198", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T14:22:38.044198.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T14:22:38.044198.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_09T14_22_38.044198", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T14:22:38.044198.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T14:22:38.044198.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_09T14_22_38.044198", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T14:22:38.044198.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T14:22:38.044198.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_09T14_22_38.044198", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T14:22:38.044198.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T14:22:38.044198.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_09T14_22_38.044198", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T14:22:38.044198.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T14:22:38.044198.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_09T14_22_38.044198", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T14:22:38.044198.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T14:22:38.044198.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_09T14_22_38.044198", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T14:22:38.044198.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T14:22:38.044198.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_09T14_22_38.044198", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T14:22:38.044198.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T14:22:38.044198.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_08_09T14_22_38.044198", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T14:22:38.044198.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T14:22:38.044198.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_09T14_22_38.044198", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T14:22:38.044198.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T14:22:38.044198.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_09T14_22_38.044198", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T14:22:38.044198.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T14:22:38.044198.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_09T14_22_38.044198", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T14:22:38.044198.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T14:22:38.044198.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_09T14_22_38.044198", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T14:22:38.044198.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T14:22:38.044198.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_09T14_22_38.044198", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T14:22:38.044198.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T14:22:38.044198.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_09T14_22_38.044198", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T14:22:38.044198.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T14:22:38.044198.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_09T14_22_38.044198", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T14:22:38.044198.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T14:22:38.044198.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_09T14_22_38.044198", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T14:22:38.044198.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T14:22:38.044198.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_09T14_22_38.044198", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T14:22:38.044198.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T14:22:38.044198.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_09T14_22_38.044198", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T14:22:38.044198.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T14:22:38.044198.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_09T14_22_38.044198", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T14:22:38.044198.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T14:22:38.044198.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_09T14_22_38.044198", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T14:22:38.044198.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T14:22:38.044198.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_09T14_22_38.044198", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T14:22:38.044198.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T14:22:38.044198.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_09T14_22_38.044198", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T14:22:38.044198.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T14:22:38.044198.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_09T14_22_38.044198", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T14:22:38.044198.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T14:22:38.044198.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_09T14_22_38.044198", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T14:22:38.044198.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T14:22:38.044198.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_09T14_22_38.044198", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T14:22:38.044198.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T14:22:38.044198.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_09T14_22_38.044198", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T14:22:38.044198.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T14:22:38.044198.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_09T14_22_38.044198", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T14:22:38.044198.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T14:22:38.044198.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_09T14_22_38.044198", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T14:22:38.044198.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T14:22:38.044198.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_09T14_22_38.044198", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T14:22:38.044198.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T14:22:38.044198.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_09T14_22_38.044198", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T14:22:38.044198.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T14:22:38.044198.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_09T14_22_38.044198", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T14:22:38.044198.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T14:22:38.044198.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_09T14_22_38.044198", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T14:22:38.044198.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T14:22:38.044198.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_09T14_22_38.044198", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T14:22:38.044198.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T14:22:38.044198.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_09T14_22_38.044198", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T14:22:38.044198.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T14:22:38.044198.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_09T14_22_38.044198", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T14:22:38.044198.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T14:22:38.044198.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_09T14_22_38.044198", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T14:22:38.044198.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T14:22:38.044198.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_09T14_22_38.044198", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T14:22:38.044198.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T14:22:38.044198.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_09T14_22_38.044198", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T14:22:38.044198.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T14:22:38.044198.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_09T14_22_38.044198", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T14:22:38.044198.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T14:22:38.044198.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_09T14_22_38.044198", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T14:22:38.044198.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T14:22:38.044198.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_09T14_22_38.044198", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T14:22:38.044198.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T14:22:38.044198.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_09T14_22_38.044198", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T14:22:38.044198.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T14:22:38.044198.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_09T14_22_38.044198", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T14:22:38.044198.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T14:22:38.044198.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_09T14_22_38.044198", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T14:22:38.044198.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T14:22:38.044198.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_09T14_22_38.044198", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T14:22:38.044198.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T14:22:38.044198.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_09T14_22_38.044198", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T14:22:38.044198.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T14:22:38.044198.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_09T14_22_38.044198", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T14:22:38.044198.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T14:22:38.044198.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_09T14_22_38.044198", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T14:22:38.044198.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T14:22:38.044198.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_09T14_22_38.044198", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T14:22:38.044198.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T14:22:38.044198.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_09T14_22_38.044198", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T14:22:38.044198.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T14:22:38.044198.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_09T14_22_38.044198", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T14:22:38.044198.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T14:22:38.044198.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_09T14_22_38.044198", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T14:22:38.044198.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T14:22:38.044198.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_09T14_22_38.044198", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T14:22:38.044198.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T14:22:38.044198.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_09T14_22_38.044198", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T14:22:38.044198.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T14:22:38.044198.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_09T14_22_38.044198", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T14:22:38.044198.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T14:22:38.044198.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_21T18_07_38.079229", "path": ["**/details_harness|winogrande|5_2023-10-21T18-07-38.079229.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-21T18-07-38.079229.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_09T14_22_38.044198", "path": ["results_2023-08-09T14:22:38.044198.parquet"]}, {"split": "2023_10_21T18_07_38.079229", "path": ["results_2023-10-21T18-07-38.079229.parquet"]}, {"split": "latest", "path": ["results_2023-10-21T18-07-38.079229.parquet"]}]}]}
# Dataset Card for Evaluation run of TheTravellingEngineer/llama2-7b-chat-hf-guanaco
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheTravellingEngineer/llama2-7b-chat-hf-guanaco
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheTravellingEngineer/llama2-7b-chat-hf-guanaco](https://huggingface.co/TheTravellingEngineer/llama2-7b-chat-hf-guanaco) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheTravellingEngineer__llama2-7b-chat-hf-guanaco",
"harness_winogrande_5",
split="train")
```
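The aggregated "results" configuration mentioned above can be loaded the same way. As a minimal sketch, relying only on the split naming described in this card (timestamped splits plus a "latest" alias):

```python
from datasets import load_dataset

# The "latest" split of the aggregated "results" configuration mirrors
# the most recent evaluation run described in this card.
results = load_dataset(
    "open-llm-leaderboard/details_TheTravellingEngineer__llama2-7b-chat-hf-guanaco",
    "results",
    split="latest",
)
print(results)
```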
## Latest results
These are the [latest results from run 2023-09-16T15:24:09.297572](https://huggingface.co/datasets/open-llm-leaderboard/details_TheTravellingEngineer__llama2-7b-chat-hf-guanaco/blob/main/results_2023-09-16T15-24-09.297572.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0028313758389261743,
"em_stderr": 0.0005441551135494218,
"f1": 0.05759857382550368,
"f1_stderr": 0.0013970900427636582,
"acc": 0.4074763654032228,
"acc_stderr": 0.01009856180825454
},
"harness|drop|3": {
"em": 0.0028313758389261743,
"em_stderr": 0.0005441551135494218,
"f1": 0.05759857382550368,
"f1_stderr": 0.0013970900427636582
},
"harness|gsm8k|5": {
"acc": 0.08567096285064443,
"acc_stderr": 0.007709218855882777
},
"harness|winogrande|5": {
"acc": 0.7292817679558011,
"acc_stderr": 0.012487904760626304
}
}
```
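If you want the raw JSON shown above rather than a parquet-backed split, a small sketch using `huggingface_hub` is also possible. The filename comes from the results link above; the exact key layout inside the file is an assumption:

```python
import json

from huggingface_hub import hf_hub_download

# Download the raw results file linked above straight from the dataset repo.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_TheTravellingEngineer__llama2-7b-chat-hf-guanaco",
    filename="results_2023-09-16T15-24-09.297572.json",
    repo_type="dataset",
)
with open(path) as f:
    data = json.load(f)

# The snippet above is the metrics portion of this file; depending on the
# file layout the metrics may sit under a top-level "results" key.
metrics = data.get("results", data)
print(metrics["all"])
```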
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_TheTravellingEngineer__llama2-7b-chat-hf-guanaco
|
[
"region:us"
] |
2023-08-18T10:13:44+00:00
|
{"pretty_name": "Evaluation run of TheTravellingEngineer/llama2-7b-chat-hf-guanaco", "dataset_summary": "Dataset automatically created during the evaluation run of model [TheTravellingEngineer/llama2-7b-chat-hf-guanaco](https://huggingface.co/TheTravellingEngineer/llama2-7b-chat-hf-guanaco) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheTravellingEngineer__llama2-7b-chat-hf-guanaco\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-16T15:24:09.297572](https://huggingface.co/datasets/open-llm-leaderboard/details_TheTravellingEngineer__llama2-7b-chat-hf-guanaco/blob/main/results_2023-09-16T15-24-09.297572.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0028313758389261743,\n \"em_stderr\": 0.0005441551135494218,\n \"f1\": 0.05759857382550368,\n \"f1_stderr\": 0.0013970900427636582,\n \"acc\": 0.4074763654032228,\n \"acc_stderr\": 0.01009856180825454\n },\n \"harness|drop|3\": {\n \"em\": 0.0028313758389261743,\n \"em_stderr\": 0.0005441551135494218,\n \"f1\": 0.05759857382550368,\n \"f1_stderr\": 0.0013970900427636582\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.08567096285064443,\n \"acc_stderr\": 0.007709218855882777\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7292817679558011,\n \"acc_stderr\": 0.012487904760626304\n }\n}\n```", "repo_url": "https://huggingface.co/TheTravellingEngineer/llama2-7b-chat-hf-guanaco", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_02T15_25_50.809561", "path": ["**/details_harness|arc:challenge|25_2023-08-02T15:25:50.809561.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-02T15:25:50.809561.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_16T15_24_09.297572", "path": ["**/details_harness|drop|3_2023-09-16T15-24-09.297572.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-16T15-24-09.297572.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_16T15_24_09.297572", "path": ["**/details_harness|gsm8k|5_2023-09-16T15-24-09.297572.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-16T15-24-09.297572.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_02T15_25_50.809561", "path": ["**/details_harness|hellaswag|10_2023-08-02T15:25:50.809561.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hellaswag|10_2023-08-02T15:25:50.809561.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_02T15_25_50.809561", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-02T15:25:50.809561.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-02T15:25:50.809561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-02T15:25:50.809561.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-02T15:25:50.809561.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-02T15:25:50.809561.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-08-02T15:25:50.809561.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_02T15_25_50.809561", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-02T15:25:50.809561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-02T15:25:50.809561.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_02T15_25_50.809561", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-02T15:25:50.809561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-02T15:25:50.809561.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_02T15_25_50.809561", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-02T15:25:50.809561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-02T15:25:50.809561.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_02T15_25_50.809561", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-02T15:25:50.809561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-02T15:25:50.809561.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_02T15_25_50.809561", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-02T15:25:50.809561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-02T15:25:50.809561.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_02T15_25_50.809561", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-02T15:25:50.809561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-02T15:25:50.809561.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_02T15_25_50.809561", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-02T15:25:50.809561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-02T15:25:50.809561.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_02T15_25_50.809561", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-02T15:25:50.809561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-02T15:25:50.809561.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_02T15_25_50.809561", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-02T15:25:50.809561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-02T15:25:50.809561.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_02T15_25_50.809561", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-02T15:25:50.809561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-02T15:25:50.809561.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_08_02T15_25_50.809561", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-02T15:25:50.809561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-02T15:25:50.809561.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_02T15_25_50.809561", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-02T15:25:50.809561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-02T15:25:50.809561.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_02T15_25_50.809561", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-02T15:25:50.809561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-02T15:25:50.809561.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_02T15_25_50.809561", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-02T15:25:50.809561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-02T15:25:50.809561.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_02T15_25_50.809561", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-02T15:25:50.809561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-02T15:25:50.809561.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_02T15_25_50.809561", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-02T15:25:50.809561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-02T15:25:50.809561.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_02T15_25_50.809561", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-02T15:25:50.809561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-02T15:25:50.809561.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_02T15_25_50.809561", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-02T15:25:50.809561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-02T15:25:50.809561.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_02T15_25_50.809561", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-02T15:25:50.809561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-02T15:25:50.809561.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_02T15_25_50.809561", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-02T15:25:50.809561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-02T15:25:50.809561.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_02T15_25_50.809561", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-02T15:25:50.809561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-02T15:25:50.809561.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_02T15_25_50.809561", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-02T15:25:50.809561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-02T15:25:50.809561.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_02T15_25_50.809561", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-02T15:25:50.809561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-02T15:25:50.809561.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_02T15_25_50.809561", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-02T15:25:50.809561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-02T15:25:50.809561.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_02T15_25_50.809561", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-02T15:25:50.809561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-02T15:25:50.809561.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_02T15_25_50.809561", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-02T15:25:50.809561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-02T15:25:50.809561.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_02T15_25_50.809561", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-02T15:25:50.809561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-02T15:25:50.809561.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_02T15_25_50.809561", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-02T15:25:50.809561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-02T15:25:50.809561.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_02T15_25_50.809561", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-02T15:25:50.809561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-02T15:25:50.809561.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_02T15_25_50.809561", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-02T15:25:50.809561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-02T15:25:50.809561.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_02T15_25_50.809561", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-02T15:25:50.809561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-02T15:25:50.809561.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_02T15_25_50.809561", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-02T15:25:50.809561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-02T15:25:50.809561.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_02T15_25_50.809561", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-02T15:25:50.809561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-02T15:25:50.809561.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_02T15_25_50.809561", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-02T15:25:50.809561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-02T15:25:50.809561.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_02T15_25_50.809561", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-02T15:25:50.809561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-02T15:25:50.809561.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_02T15_25_50.809561", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-02T15:25:50.809561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-02T15:25:50.809561.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_02T15_25_50.809561", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-02T15:25:50.809561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-02T15:25:50.809561.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_02T15_25_50.809561", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-02T15:25:50.809561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-02T15:25:50.809561.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_02T15_25_50.809561", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-02T15:25:50.809561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-02T15:25:50.809561.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_02T15_25_50.809561", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-02T15:25:50.809561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-02T15:25:50.809561.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_02T15_25_50.809561", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-02T15:25:50.809561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-02T15:25:50.809561.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_02T15_25_50.809561", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-02T15:25:50.809561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-02T15:25:50.809561.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_02T15_25_50.809561", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-02T15:25:50.809561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-02T15:25:50.809561.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_02T15_25_50.809561", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-02T15:25:50.809561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-02T15:25:50.809561.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_02T15_25_50.809561", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-02T15:25:50.809561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-02T15:25:50.809561.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_02T15_25_50.809561", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-02T15:25:50.809561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-02T15:25:50.809561.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_02T15_25_50.809561", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-02T15:25:50.809561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-02T15:25:50.809561.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_02T15_25_50.809561", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-02T15:25:50.809561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-02T15:25:50.809561.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_02T15_25_50.809561", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-02T15:25:50.809561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-02T15:25:50.809561.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_02T15_25_50.809561", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-02T15:25:50.809561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-02T15:25:50.809561.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_02T15_25_50.809561", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-02T15:25:50.809561.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-02T15:25:50.809561.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_02T15_25_50.809561", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-02T15:25:50.809561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-02T15:25:50.809561.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_02T15_25_50.809561", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-02T15:25:50.809561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-02T15:25:50.809561.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_02T15_25_50.809561", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-02T15:25:50.809561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-02T15:25:50.809561.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_02T15_25_50.809561", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-02T15:25:50.809561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-02T15:25:50.809561.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_02T15_25_50.809561", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-02T15:25:50.809561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-02T15:25:50.809561.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_02T15_25_50.809561", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-02T15:25:50.809561.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-02T15:25:50.809561.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_02T15_25_50.809561", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-02T15:25:50.809561.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-02T15:25:50.809561.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_16T15_24_09.297572", "path": ["**/details_harness|winogrande|5_2023-09-16T15-24-09.297572.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-16T15-24-09.297572.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_02T15_25_50.809561", "path": ["results_2023-08-02T15:25:50.809561.parquet"]}, {"split": "2023_09_16T15_24_09.297572", "path": ["results_2023-09-16T15-24-09.297572.parquet"]}, {"split": "latest", "path": ["results_2023-09-16T15-24-09.297572.parquet"]}]}]}
|
2023-09-16T14:24:21+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of TheTravellingEngineer/llama2-7b-chat-hf-guanaco
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model TheTravellingEngineer/llama2-7b-chat-hf-guanaco on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-09-16T15:24:09.297572 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of TheTravellingEngineer/llama2-7b-chat-hf-guanaco",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheTravellingEngineer/llama2-7b-chat-hf-guanaco on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-16T15:24:09.297572(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of TheTravellingEngineer/llama2-7b-chat-hf-guanaco",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheTravellingEngineer/llama2-7b-chat-hf-guanaco on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-16T15:24:09.297572(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
30,
31,
178,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of TheTravellingEngineer/llama2-7b-chat-hf-guanaco## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheTravellingEngineer/llama2-7b-chat-hf-guanaco on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-16T15:24:09.297572(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
8f368ada3d6a2757c9c6d16c73f74cf93aa7ebd9
|
# Dataset Card for Evaluation run of TheTravellingEngineer/bloom-1b1-RLHF
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheTravellingEngineer/bloom-1b1-RLHF
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheTravellingEngineer/bloom-1b1-RLHF](https://huggingface.co/TheTravellingEngineer/bloom-1b1-RLHF) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheTravellingEngineer__bloom-1b1-RLHF",
"harness_gsm8k_5",
split="train")
```
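If you are unsure which configuration name to pass, a minimal sketch using the `datasets` utility `get_dataset_config_names` lists the available per-task configurations first:

```python
from datasets import get_dataset_config_names

# Enumerate the per-task configurations (the card mentions 64 of them).
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_TheTravellingEngineer__bloom-1b1-RLHF"
)
print(len(configs), configs[:5])
```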
## Latest results
These are the [latest results from run 2023-12-02T13:14:43.588399](https://huggingface.co/datasets/open-llm-leaderboard/details_TheTravellingEngineer__bloom-1b1-RLHF/blob/main/results_2023-12-02T13-14-43.588399.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
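To compare the runs behind these numbers, a sketch like the following loads every split of the "results" configuration at once (assuming the timestamped split naming described earlier in this card):

```python
from datasets import load_dataset

# Loading the "results" configuration without a split returns one
# timestamped split per run (the card mentions 3 runs), plus "latest".
runs = load_dataset(
    "open-llm-leaderboard/details_TheTravellingEngineer__bloom-1b1-RLHF",
    "results",
)
print(list(runs.keys()))
```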
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_TheTravellingEngineer__bloom-1b1-RLHF
|
[
"region:us"
] |
2023-08-18T10:13:53+00:00
|
{"pretty_name": "Evaluation run of TheTravellingEngineer/bloom-1b1-RLHF", "dataset_summary": "Dataset automatically created during the evaluation run of model [TheTravellingEngineer/bloom-1b1-RLHF](https://huggingface.co/TheTravellingEngineer/bloom-1b1-RLHF) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheTravellingEngineer__bloom-1b1-RLHF\",\n\t\"harness_gsm8k_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-02T13:14:43.588399](https://huggingface.co/datasets/open-llm-leaderboard/details_TheTravellingEngineer__bloom-1b1-RLHF/blob/main/results_2023-12-02T13-14-43.588399.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": "https://huggingface.co/TheTravellingEngineer/bloom-1b1-RLHF", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_09T08_38_39.084452", "path": ["**/details_harness|arc:challenge|25_2023-08-09T08:38:39.084452.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-09T08:38:39.084452.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_16T19_51_27.983287", "path": ["**/details_harness|drop|3_2023-10-16T19-51-27.983287.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-16T19-51-27.983287.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_16T19_51_27.983287", "path": ["**/details_harness|gsm8k|5_2023-10-16T19-51-27.983287.parquet"]}, {"split": "2023_12_02T13_14_43.588399", "path": ["**/details_harness|gsm8k|5_2023-12-02T13-14-43.588399.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-02T13-14-43.588399.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_09T08_38_39.084452", "path": ["**/details_harness|hellaswag|10_2023-08-09T08:38:39.084452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-09T08:38:39.084452.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_09T08_38_39.084452", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T08:38:39.084452.parquet", 
"**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T08:38:39.084452.parquet", 
"**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T08:38:39.084452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T08:38:39.084452.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T08:38:39.084452.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T08:38:39.084452.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_09T08_38_39.084452", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T08:38:39.084452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T08:38:39.084452.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_09T08_38_39.084452", "path": 
["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T08:38:39.084452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T08:38:39.084452.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_09T08_38_39.084452", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T08:38:39.084452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T08:38:39.084452.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_09T08_38_39.084452", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T08:38:39.084452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T08:38:39.084452.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_09T08_38_39.084452", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T08:38:39.084452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T08:38:39.084452.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_09T08_38_39.084452", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T08:38:39.084452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T08:38:39.084452.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_09T08_38_39.084452", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T08:38:39.084452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T08:38:39.084452.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_09T08_38_39.084452", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T08:38:39.084452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T08:38:39.084452.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_09T08_38_39.084452", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T08:38:39.084452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T08:38:39.084452.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_09T08_38_39.084452", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T08:38:39.084452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T08:38:39.084452.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_09T08_38_39.084452", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T08:38:39.084452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T08:38:39.084452.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_09T08_38_39.084452", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T08:38:39.084452.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T08:38:39.084452.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_09T08_38_39.084452", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T08:38:39.084452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T08:38:39.084452.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_09T08_38_39.084452", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T08:38:39.084452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T08:38:39.084452.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_09T08_38_39.084452", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T08:38:39.084452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T08:38:39.084452.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_09T08_38_39.084452", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T08:38:39.084452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T08:38:39.084452.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_09T08_38_39.084452", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T08:38:39.084452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T08:38:39.084452.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_09T08_38_39.084452", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T08:38:39.084452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T08:38:39.084452.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_09T08_38_39.084452", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T08:38:39.084452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T08:38:39.084452.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_09T08_38_39.084452", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T08:38:39.084452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T08:38:39.084452.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_09T08_38_39.084452", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T08:38:39.084452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T08:38:39.084452.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_09T08_38_39.084452", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T08:38:39.084452.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T08:38:39.084452.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_09T08_38_39.084452", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T08:38:39.084452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T08:38:39.084452.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_09T08_38_39.084452", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T08:38:39.084452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T08:38:39.084452.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_09T08_38_39.084452", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T08:38:39.084452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T08:38:39.084452.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_09T08_38_39.084452", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T08:38:39.084452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T08:38:39.084452.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_09T08_38_39.084452", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T08:38:39.084452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T08:38:39.084452.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_09T08_38_39.084452", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T08:38:39.084452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T08:38:39.084452.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_09T08_38_39.084452", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T08:38:39.084452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T08:38:39.084452.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_09T08_38_39.084452", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T08:38:39.084452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T08:38:39.084452.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_09T08_38_39.084452", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T08:38:39.084452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T08:38:39.084452.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_09T08_38_39.084452", "path": 
["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T08:38:39.084452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T08:38:39.084452.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_09T08_38_39.084452", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T08:38:39.084452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T08:38:39.084452.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_09T08_38_39.084452", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T08:38:39.084452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T08:38:39.084452.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_09T08_38_39.084452", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T08:38:39.084452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T08:38:39.084452.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_09T08_38_39.084452", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T08:38:39.084452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T08:38:39.084452.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_09T08_38_39.084452", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T08:38:39.084452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T08:38:39.084452.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_09T08_38_39.084452", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T08:38:39.084452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T08:38:39.084452.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_09T08_38_39.084452", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T08:38:39.084452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T08:38:39.084452.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_09T08_38_39.084452", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T08:38:39.084452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T08:38:39.084452.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_09T08_38_39.084452", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T08:38:39.084452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T08:38:39.084452.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_09T08_38_39.084452", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T08:38:39.084452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T08:38:39.084452.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_09T08_38_39.084452", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T08:38:39.084452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T08:38:39.084452.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_09T08_38_39.084452", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T08:38:39.084452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T08:38:39.084452.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_09T08_38_39.084452", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T08:38:39.084452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T08:38:39.084452.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_09T08_38_39.084452", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T08:38:39.084452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T08:38:39.084452.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_09T08_38_39.084452", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T08:38:39.084452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T08:38:39.084452.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_09T08_38_39.084452", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T08:38:39.084452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T08:38:39.084452.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_09T08_38_39.084452", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T08:38:39.084452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T08:38:39.084452.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_09T08_38_39.084452", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T08:38:39.084452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T08:38:39.084452.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_09T08_38_39.084452", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T08:38:39.084452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T08:38:39.084452.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_09T08_38_39.084452", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T08:38:39.084452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T08:38:39.084452.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_09T08_38_39.084452", "path": 
["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T08:38:39.084452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T08:38:39.084452.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_09T08_38_39.084452", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T08:38:39.084452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T08:38:39.084452.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_09T08_38_39.084452", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T08:38:39.084452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T08:38:39.084452.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_09T08_38_39.084452", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T08:38:39.084452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T08:38:39.084452.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_09T08_38_39.084452", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T08:38:39.084452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T08:38:39.084452.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_09T08_38_39.084452", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T08:38:39.084452.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T08:38:39.084452.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_16T19_51_27.983287", "path": ["**/details_harness|winogrande|5_2023-10-16T19-51-27.983287.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-16T19-51-27.983287.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_09T08_38_39.084452", "path": ["results_2023-08-09T08:38:39.084452.parquet"]}, {"split": "2023_10_16T19_51_27.983287", "path": ["results_2023-10-16T19-51-27.983287.parquet"]}, {"split": "2023_12_02T13_14_43.588399", "path": ["results_2023-12-02T13-14-43.588399.parquet"]}, {"split": "latest", "path": ["results_2023-12-02T13-14-43.588399.parquet"]}]}]}
# Dataset Card for Evaluation run of TheTravellingEngineer/bloom-1b1-RLHF
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheTravellingEngineer/bloom-1b1-RLHF
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheTravellingEngineer/bloom-1b1-RLHF](https://huggingface.co/TheTravellingEngineer/bloom-1b1-RLHF) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
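The original snippet was stripped from this copy of the card; a minimal sketch, assuming the dataset id follows the `details_<org>__<model>` naming used by the other cards in this collection (the `harness_winogrande_5` config is listed in the card's metadata above):

```python
from datasets import load_dataset

# Repo id assumed from the details_<org>__<model> naming convention;
# the "harness_winogrande_5" config appears in this card's metadata.
data = load_dataset("open-llm-leaderboard/details_TheTravellingEngineer__bloom-1b1-RLHF",
	"harness_winogrande_5",
	split="train")
```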
## Latest results
These are the latest results from run 2023-12-02T13:14:43.588399 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
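The results file itself is missing from this copy of the card; as a sketch, the aggregated numbers can be pulled from the `results` configuration instead (the config name and `latest` split are taken from the metadata above, while the repo id is assumed from the naming convention):

```python
from datasets import load_dataset

# "results" config and "latest" split as listed in this card's metadata;
# repo id assumed from the details_<org>__<model> convention.
results = load_dataset("open-llm-leaderboard/details_TheTravellingEngineer__bloom-1b1-RLHF",
	"results",
	split="latest")
```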
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
# Dataset Card for Evaluation run of TheTravellingEngineer/llama2-7b-hf-guanaco
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheTravellingEngineer/llama2-7b-hf-guanaco
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheTravellingEngineer/llama2-7b-hf-guanaco](https://huggingface.co/TheTravellingEngineer/llama2-7b-hf-guanaco) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheTravellingEngineer__llama2-7b-hf-guanaco",
"harness_winogrande_5",
split="train")
```
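Each run can also be loaded directly through its timestamped split rather than `train`; a minimal sketch, with the config and split names taken verbatim from this card's metadata below:

```python
from datasets import load_dataset

# Timestamped split name exactly as it appears in the metadata's data_files
data = load_dataset("open-llm-leaderboard/details_TheTravellingEngineer__llama2-7b-hf-guanaco",
	"harness_gsm8k_5",
	split="2023_10_18T16_29_27.182983")
```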
## Latest results
These are the [latest results from run 2023-10-18T16:29:27.182983](https://huggingface.co/datasets/open-llm-leaderboard/details_TheTravellingEngineer__llama2-7b-hf-guanaco/blob/main/results_2023-10-18T16-29-27.182983.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0020973154362416107,
"em_stderr": 0.00046850650303683565,
"f1": 0.061095847315436325,
"f1_stderr": 0.001395478650607009,
"acc": 0.4012810163878904,
"acc_stderr": 0.009436506107689084
},
"harness|drop|3": {
"em": 0.0020973154362416107,
"em_stderr": 0.00046850650303683565,
"f1": 0.061095847315436325,
"f1_stderr": 0.001395478650607009
},
"harness|gsm8k|5": {
"acc": 0.060652009097801364,
"acc_stderr": 0.006574733381405772
},
"harness|winogrande|5": {
"acc": 0.7419100236779794,
"acc_stderr": 0.012298278833972397
}
}
```
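As a quick sanity check, for this run the aggregate `acc` reported under `"all"` is the unweighted mean of the gsm8k and winogrande accuracies, which can be verified directly:

```python
# Unweighted mean of the two per-task accuracies reproduces the "all" value
gsm8k_acc = 0.060652009097801364
winogrande_acc = 0.7419100236779794
print((gsm8k_acc + winogrande_acc) / 2)  # matches the "all" acc of 0.4012810163878904
```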
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_TheTravellingEngineer__llama2-7b-hf-guanaco
|
[
"region:us"
] |
2023-08-18T10:14:02+00:00
|
{"pretty_name": "Evaluation run of TheTravellingEngineer/llama2-7b-hf-guanaco", "dataset_summary": "Dataset automatically created during the evaluation run of model [TheTravellingEngineer/llama2-7b-hf-guanaco](https://huggingface.co/TheTravellingEngineer/llama2-7b-hf-guanaco) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheTravellingEngineer__llama2-7b-hf-guanaco\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-18T16:29:27.182983](https://huggingface.co/datasets/open-llm-leaderboard/details_TheTravellingEngineer__llama2-7b-hf-guanaco/blob/main/results_2023-10-18T16-29-27.182983.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0020973154362416107,\n \"em_stderr\": 0.00046850650303683565,\n \"f1\": 0.061095847315436325,\n \"f1_stderr\": 0.001395478650607009,\n \"acc\": 0.4012810163878904,\n \"acc_stderr\": 0.009436506107689084\n },\n \"harness|drop|3\": {\n \"em\": 0.0020973154362416107,\n \"em_stderr\": 0.00046850650303683565,\n \"f1\": 0.061095847315436325,\n \"f1_stderr\": 0.001395478650607009\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.060652009097801364,\n \"acc_stderr\": 0.006574733381405772\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7419100236779794,\n \"acc_stderr\": 0.012298278833972397\n }\n}\n```", "repo_url": "https://huggingface.co/TheTravellingEngineer/llama2-7b-hf-guanaco", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_27T11_13_14.214419", "path": ["**/details_harness|arc:challenge|25_2023-07-27T11:13:14.214419.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-27T11:13:14.214419.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_18T16_29_27.182983", "path": ["**/details_harness|drop|3_2023-10-18T16-29-27.182983.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-18T16-29-27.182983.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_18T16_29_27.182983", "path": ["**/details_harness|gsm8k|5_2023-10-18T16-29-27.182983.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-18T16-29-27.182983.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_27T11_13_14.214419", "path": ["**/details_harness|hellaswag|10_2023-07-27T11:13:14.214419.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hellaswag|10_2023-07-27T11:13:14.214419.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_27T11_13_14.214419", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-27T11:13:14.214419.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-27T11:13:14.214419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-27T11:13:14.214419.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-27T11:13:14.214419.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-27T11:13:14.214419.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-07-27T11:13:14.214419.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_27T11_13_14.214419", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-27T11:13:14.214419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-27T11:13:14.214419.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_27T11_13_14.214419", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-27T11:13:14.214419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-27T11:13:14.214419.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_27T11_13_14.214419", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-27T11:13:14.214419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-27T11:13:14.214419.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_27T11_13_14.214419", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-27T11:13:14.214419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-27T11:13:14.214419.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_27T11_13_14.214419", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-27T11:13:14.214419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-27T11:13:14.214419.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_27T11_13_14.214419", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-27T11:13:14.214419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-27T11:13:14.214419.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_27T11_13_14.214419", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-27T11:13:14.214419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-27T11:13:14.214419.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_27T11_13_14.214419", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-27T11:13:14.214419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-27T11:13:14.214419.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_27T11_13_14.214419", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-27T11:13:14.214419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-27T11:13:14.214419.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_27T11_13_14.214419", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-27T11:13:14.214419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-27T11:13:14.214419.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_07_27T11_13_14.214419", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-27T11:13:14.214419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-27T11:13:14.214419.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_27T11_13_14.214419", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-27T11:13:14.214419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-27T11:13:14.214419.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_27T11_13_14.214419", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-27T11:13:14.214419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-27T11:13:14.214419.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_27T11_13_14.214419", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-27T11:13:14.214419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-27T11:13:14.214419.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_27T11_13_14.214419", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-27T11:13:14.214419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-27T11:13:14.214419.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_27T11_13_14.214419", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-27T11:13:14.214419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-27T11:13:14.214419.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_27T11_13_14.214419", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-27T11:13:14.214419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-27T11:13:14.214419.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_27T11_13_14.214419", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-27T11:13:14.214419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-27T11:13:14.214419.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_27T11_13_14.214419", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-27T11:13:14.214419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-27T11:13:14.214419.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_27T11_13_14.214419", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-27T11:13:14.214419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-27T11:13:14.214419.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_27T11_13_14.214419", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-27T11:13:14.214419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-27T11:13:14.214419.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_27T11_13_14.214419", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-27T11:13:14.214419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-27T11:13:14.214419.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_27T11_13_14.214419", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-27T11:13:14.214419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-27T11:13:14.214419.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_27T11_13_14.214419", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-27T11:13:14.214419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-27T11:13:14.214419.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_27T11_13_14.214419", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-27T11:13:14.214419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-27T11:13:14.214419.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_27T11_13_14.214419", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-27T11:13:14.214419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-27T11:13:14.214419.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_27T11_13_14.214419", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-27T11:13:14.214419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-27T11:13:14.214419.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_27T11_13_14.214419", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-27T11:13:14.214419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-27T11:13:14.214419.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_27T11_13_14.214419", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-27T11:13:14.214419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-27T11:13:14.214419.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_27T11_13_14.214419", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-27T11:13:14.214419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-27T11:13:14.214419.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_27T11_13_14.214419", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-27T11:13:14.214419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-27T11:13:14.214419.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_27T11_13_14.214419", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-27T11:13:14.214419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-27T11:13:14.214419.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_27T11_13_14.214419", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-27T11:13:14.214419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-27T11:13:14.214419.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_27T11_13_14.214419", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-27T11:13:14.214419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-27T11:13:14.214419.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_27T11_13_14.214419", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-27T11:13:14.214419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-27T11:13:14.214419.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_27T11_13_14.214419", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-27T11:13:14.214419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-27T11:13:14.214419.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_27T11_13_14.214419", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-27T11:13:14.214419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-27T11:13:14.214419.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_27T11_13_14.214419", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-27T11:13:14.214419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-27T11:13:14.214419.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_27T11_13_14.214419", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-27T11:13:14.214419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-27T11:13:14.214419.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_27T11_13_14.214419", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-27T11:13:14.214419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-27T11:13:14.214419.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_27T11_13_14.214419", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-27T11:13:14.214419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-27T11:13:14.214419.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_27T11_13_14.214419", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-27T11:13:14.214419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-27T11:13:14.214419.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_27T11_13_14.214419", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-27T11:13:14.214419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-27T11:13:14.214419.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_27T11_13_14.214419", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-27T11:13:14.214419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-27T11:13:14.214419.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_27T11_13_14.214419", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-27T11:13:14.214419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-27T11:13:14.214419.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_27T11_13_14.214419", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-27T11:13:14.214419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-27T11:13:14.214419.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_27T11_13_14.214419", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-27T11:13:14.214419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-27T11:13:14.214419.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_27T11_13_14.214419", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-27T11:13:14.214419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-27T11:13:14.214419.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_27T11_13_14.214419", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-27T11:13:14.214419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-27T11:13:14.214419.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_27T11_13_14.214419", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-27T11:13:14.214419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-27T11:13:14.214419.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_27T11_13_14.214419", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-27T11:13:14.214419.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-27T11:13:14.214419.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_27T11_13_14.214419", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-27T11:13:14.214419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-27T11:13:14.214419.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_27T11_13_14.214419", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-27T11:13:14.214419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-27T11:13:14.214419.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_27T11_13_14.214419", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-27T11:13:14.214419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-27T11:13:14.214419.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_27T11_13_14.214419", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-27T11:13:14.214419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-27T11:13:14.214419.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_27T11_13_14.214419", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-27T11:13:14.214419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-27T11:13:14.214419.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_27T11_13_14.214419", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-27T11:13:14.214419.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-27T11:13:14.214419.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_27T11_13_14.214419", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-27T11:13:14.214419.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-27T11:13:14.214419.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_18T16_29_27.182983", "path": ["**/details_harness|winogrande|5_2023-10-18T16-29-27.182983.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-18T16-29-27.182983.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_27T11_13_14.214419", "path": ["results_2023-07-27T11:13:14.214419.parquet"]}, {"split": "2023_10_18T16_29_27.182983", "path": ["results_2023-10-18T16-29-27.182983.parquet"]}, {"split": "latest", "path": ["results_2023-10-18T16-29-27.182983.parquet"]}]}]}
|
2023-10-18T15:29:39+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of TheTravellingEngineer/llama2-7b-hf-guanaco
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model TheTravellingEngineer/llama2-7b-hf-guanaco on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
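A minimal sketch, assuming this details repo follows the `open-llm-leaderboard/details_<org>__<model>` naming pattern used by the other cards in this document (the repo id below is inferred from that convention, not stated in this card):

```python
from datasets import load_dataset

# Repo id assumed from the open-llm-leaderboard naming convention for this model;
# "harness_winogrande_5" is one of the per-task configs.
data = load_dataset("open-llm-leaderboard/details_TheTravellingEngineer__llama2-7b-hf-guanaco",
	"harness_winogrande_5",
	split="train")
```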
## Latest results
These are the latest results from run 2023-10-18T16:29:27.182983 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of TheTravellingEngineer/llama2-7b-hf-guanaco",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheTravellingEngineer/llama2-7b-hf-guanaco on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-18T16:29:27.182983(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of TheTravellingEngineer/llama2-7b-hf-guanaco",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheTravellingEngineer/llama2-7b-hf-guanaco on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-18T16:29:27.182983(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
28,
31,
176,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of TheTravellingEngineer/llama2-7b-hf-guanaco## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheTravellingEngineer/llama2-7b-hf-guanaco on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-18T16:29:27.182983(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
ea8c1ba9747c68a2862a37d4933d3f17cec9df0e
|
# Dataset Card for Evaluation run of TheTravellingEngineer/llama2-7b-chat-hf-v2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheTravellingEngineer/llama2-7b-chat-hf-v2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheTravellingEngineer/llama2-7b-chat-hf-v2](https://huggingface.co/TheTravellingEngineer/llama2-7b-chat-hf-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheTravellingEngineer__llama2-7b-chat-hf-v2",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-15T20:22:38.137564](https://huggingface.co/datasets/open-llm-leaderboard/details_TheTravellingEngineer__llama2-7b-chat-hf-v2/blob/main/results_2023-10-15T20-22-38.137564.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0012583892617449664,
"em_stderr": 0.00036305608931194434,
"f1": 0.055925964765100665,
"f1_stderr": 0.0013181664771628632,
"acc": 0.4057988012013119,
"acc_stderr": 0.00970458141675358
},
"harness|drop|3": {
"em": 0.0012583892617449664,
"em_stderr": 0.00036305608931194434,
"f1": 0.055925964765100665,
"f1_stderr": 0.0013181664771628632
},
"harness|gsm8k|5": {
"acc": 0.0712661106899166,
"acc_stderr": 0.007086462127954491
},
"harness|winogrande|5": {
"acc": 0.7403314917127072,
"acc_stderr": 0.012322700705552667
}
}
```
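To pull these aggregated numbers programmatically, the "results" config listed in the dataset metadata can be loaded the same way. A short sketch, assuming the same repo id as above; the "latest" split name comes from the configs list further down in this card's metadata:

```python
from datasets import load_dataset

# The "results" config aggregates all runs; its "latest" split points at
# the most recent results parquet file.
results = load_dataset("open-llm-leaderboard/details_TheTravellingEngineer__llama2-7b-chat-hf-v2",
	"results",
	split="latest")
```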
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_TheTravellingEngineer__llama2-7b-chat-hf-v2
|
[
"region:us"
] |
2023-08-18T10:14:10+00:00
|
{"pretty_name": "Evaluation run of TheTravellingEngineer/llama2-7b-chat-hf-v2", "dataset_summary": "Dataset automatically created during the evaluation run of model [TheTravellingEngineer/llama2-7b-chat-hf-v2](https://huggingface.co/TheTravellingEngineer/llama2-7b-chat-hf-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheTravellingEngineer__llama2-7b-chat-hf-v2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-15T20:22:38.137564](https://huggingface.co/datasets/open-llm-leaderboard/details_TheTravellingEngineer__llama2-7b-chat-hf-v2/blob/main/results_2023-10-15T20-22-38.137564.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0012583892617449664,\n \"em_stderr\": 0.00036305608931194434,\n \"f1\": 0.055925964765100665,\n \"f1_stderr\": 0.0013181664771628632,\n \"acc\": 0.4057988012013119,\n \"acc_stderr\": 0.00970458141675358\n },\n \"harness|drop|3\": {\n \"em\": 0.0012583892617449664,\n \"em_stderr\": 0.00036305608931194434,\n \"f1\": 0.055925964765100665,\n \"f1_stderr\": 0.0013181664771628632\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0712661106899166,\n \"acc_stderr\": 0.007086462127954491\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7403314917127072,\n \"acc_stderr\": 0.012322700705552667\n }\n}\n```", "repo_url": "https://huggingface.co/TheTravellingEngineer/llama2-7b-chat-hf-v2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_16T13_42_39.642131", "path": ["**/details_harness|arc:challenge|25_2023-08-16T13:42:39.642131.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-16T13:42:39.642131.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_15T20_22_38.137564", "path": ["**/details_harness|drop|3_2023-10-15T20-22-38.137564.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-15T20-22-38.137564.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_15T20_22_38.137564", "path": ["**/details_harness|gsm8k|5_2023-10-15T20-22-38.137564.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-15T20-22-38.137564.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_16T13_42_39.642131", "path": ["**/details_harness|hellaswag|10_2023-08-16T13:42:39.642131.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hellaswag|10_2023-08-16T13:42:39.642131.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_16T13_42_39.642131", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-16T13:42:39.642131.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-16T13:42:39.642131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-16T13:42:39.642131.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-16T13:42:39.642131.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-16T13:42:39.642131.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-08-16T13:42:39.642131.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_16T13_42_39.642131", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-16T13:42:39.642131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-16T13:42:39.642131.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_16T13_42_39.642131", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-16T13:42:39.642131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-16T13:42:39.642131.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_16T13_42_39.642131", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-16T13:42:39.642131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-16T13:42:39.642131.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_16T13_42_39.642131", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-16T13:42:39.642131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-16T13:42:39.642131.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_16T13_42_39.642131", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-16T13:42:39.642131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-16T13:42:39.642131.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_16T13_42_39.642131", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-16T13:42:39.642131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-16T13:42:39.642131.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_16T13_42_39.642131", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-16T13:42:39.642131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-16T13:42:39.642131.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_16T13_42_39.642131", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-16T13:42:39.642131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-16T13:42:39.642131.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_16T13_42_39.642131", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-16T13:42:39.642131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-16T13:42:39.642131.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_16T13_42_39.642131", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-16T13:42:39.642131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-16T13:42:39.642131.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_08_16T13_42_39.642131", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-16T13:42:39.642131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-16T13:42:39.642131.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_16T13_42_39.642131", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-16T13:42:39.642131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-16T13:42:39.642131.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_16T13_42_39.642131", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-16T13:42:39.642131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-16T13:42:39.642131.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_16T13_42_39.642131", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-16T13:42:39.642131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-16T13:42:39.642131.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_16T13_42_39.642131", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-16T13:42:39.642131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-16T13:42:39.642131.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_16T13_42_39.642131", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-16T13:42:39.642131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-16T13:42:39.642131.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_16T13_42_39.642131", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-16T13:42:39.642131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-16T13:42:39.642131.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_16T13_42_39.642131", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-16T13:42:39.642131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-16T13:42:39.642131.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_16T13_42_39.642131", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-16T13:42:39.642131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-16T13:42:39.642131.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_16T13_42_39.642131", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-16T13:42:39.642131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-16T13:42:39.642131.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_16T13_42_39.642131", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-16T13:42:39.642131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-16T13:42:39.642131.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_16T13_42_39.642131", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-16T13:42:39.642131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-16T13:42:39.642131.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_16T13_42_39.642131", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-16T13:42:39.642131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-16T13:42:39.642131.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_16T13_42_39.642131", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-16T13:42:39.642131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-16T13:42:39.642131.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_16T13_42_39.642131", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-16T13:42:39.642131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-16T13:42:39.642131.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_16T13_42_39.642131", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-16T13:42:39.642131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-16T13:42:39.642131.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_16T13_42_39.642131", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-16T13:42:39.642131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-16T13:42:39.642131.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_16T13_42_39.642131", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-16T13:42:39.642131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-16T13:42:39.642131.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_16T13_42_39.642131", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-16T13:42:39.642131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-16T13:42:39.642131.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_16T13_42_39.642131", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-16T13:42:39.642131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-16T13:42:39.642131.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_16T13_42_39.642131", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-16T13:42:39.642131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-16T13:42:39.642131.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_16T13_42_39.642131", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-16T13:42:39.642131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-16T13:42:39.642131.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_16T13_42_39.642131", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-16T13:42:39.642131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-16T13:42:39.642131.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_16T13_42_39.642131", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-16T13:42:39.642131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-16T13:42:39.642131.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_16T13_42_39.642131", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-16T13:42:39.642131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-16T13:42:39.642131.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_16T13_42_39.642131", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-16T13:42:39.642131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-16T13:42:39.642131.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_16T13_42_39.642131", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-16T13:42:39.642131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-16T13:42:39.642131.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_16T13_42_39.642131", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-16T13:42:39.642131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-16T13:42:39.642131.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_16T13_42_39.642131", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-16T13:42:39.642131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-16T13:42:39.642131.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_16T13_42_39.642131", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-16T13:42:39.642131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-16T13:42:39.642131.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_16T13_42_39.642131", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-16T13:42:39.642131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-16T13:42:39.642131.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_16T13_42_39.642131", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-16T13:42:39.642131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-16T13:42:39.642131.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_16T13_42_39.642131", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-16T13:42:39.642131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-16T13:42:39.642131.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_16T13_42_39.642131", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-16T13:42:39.642131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-16T13:42:39.642131.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_16T13_42_39.642131", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-16T13:42:39.642131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-16T13:42:39.642131.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_16T13_42_39.642131", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-16T13:42:39.642131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-16T13:42:39.642131.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_16T13_42_39.642131", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-16T13:42:39.642131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-16T13:42:39.642131.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_16T13_42_39.642131", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-16T13:42:39.642131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-16T13:42:39.642131.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_16T13_42_39.642131", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-16T13:42:39.642131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-16T13:42:39.642131.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_16T13_42_39.642131", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-16T13:42:39.642131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-16T13:42:39.642131.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_16T13_42_39.642131", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-16T13:42:39.642131.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-16T13:42:39.642131.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_16T13_42_39.642131", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-16T13:42:39.642131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-16T13:42:39.642131.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_16T13_42_39.642131", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-16T13:42:39.642131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-16T13:42:39.642131.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_16T13_42_39.642131", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-16T13:42:39.642131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-16T13:42:39.642131.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_16T13_42_39.642131", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-16T13:42:39.642131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-16T13:42:39.642131.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_16T13_42_39.642131", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-16T13:42:39.642131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-16T13:42:39.642131.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_16T13_42_39.642131", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-16T13:42:39.642131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-16T13:42:39.642131.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_16T13_42_39.642131", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-16T13:42:39.642131.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-16T13:42:39.642131.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_15T20_22_38.137564", "path": ["**/details_harness|winogrande|5_2023-10-15T20-22-38.137564.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-15T20-22-38.137564.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_16T13_42_39.642131", "path": ["results_2023-08-16T13:42:39.642131.parquet"]}, {"split": "2023_10_15T20_22_38.137564", "path": ["results_2023-10-15T20-22-38.137564.parquet"]}, {"split": "latest", "path": ["results_2023-10-15T20-22-38.137564.parquet"]}]}]}
|
2023-10-15T19:22:50+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of TheTravellingEngineer/llama2-7b-chat-hf-v2
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model TheTravellingEngineer/llama2-7b-chat-hf-v2 on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
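The snippet below mirrors the loading example from the full card earlier in this document (same repo id, which also appears verbatim in the metadata above; "harness_winogrande_5" is one of its listed configs):

```python
from datasets import load_dataset

# Same call as in the full card above for this model's details repo.
data = load_dataset("open-llm-leaderboard/details_TheTravellingEngineer__llama2-7b-chat-hf-v2",
	"harness_winogrande_5",
	split="train")
```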
## Latest results
These are the latest results from run 2023-10-15T20:22:38.137564 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of TheTravellingEngineer/llama2-7b-chat-hf-v2",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheTravellingEngineer/llama2-7b-chat-hf-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-15T20:22:38.137564(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of TheTravellingEngineer/llama2-7b-chat-hf-v2",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheTravellingEngineer/llama2-7b-chat-hf-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-15T20:22:38.137564(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
29,
31,
177,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of TheTravellingEngineer/llama2-7b-chat-hf-v2## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheTravellingEngineer/llama2-7b-chat-hf-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-15T20:22:38.137564(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
348b5baeed5d736091500e6b4ee0f34857ef6d54
|
# Dataset Card for Evaluation run of TheTravellingEngineer/bloom-1b1-RLHF-v2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheTravellingEngineer/bloom-1b1-RLHF-v2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheTravellingEngineer/bloom-1b1-RLHF-v2](https://huggingface.co/TheTravellingEngineer/bloom-1b1-RLHF-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheTravellingEngineer__bloom-1b1-RLHF-v2",
"harness_gsm8k_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-02T13:43:58.509097](https://huggingface.co/datasets/open-llm-leaderboard/details_TheTravellingEngineer__bloom-1b1-RLHF-v2/blob/main/results_2023-12-02T13-43-58.509097.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_TheTravellingEngineer__bloom-1b1-RLHF-v2
|
[
"region:us"
] |
2023-08-18T10:14:19+00:00
|
{"pretty_name": "Evaluation run of TheTravellingEngineer/bloom-1b1-RLHF-v2", "dataset_summary": "Dataset automatically created during the evaluation run of model [TheTravellingEngineer/bloom-1b1-RLHF-v2](https://huggingface.co/TheTravellingEngineer/bloom-1b1-RLHF-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheTravellingEngineer__bloom-1b1-RLHF-v2\",\n\t\"harness_gsm8k_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-02T13:43:58.509097](https://huggingface.co/datasets/open-llm-leaderboard/details_TheTravellingEngineer__bloom-1b1-RLHF-v2/blob/main/results_2023-12-02T13-43-58.509097.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": "https://huggingface.co/TheTravellingEngineer/bloom-1b1-RLHF-v2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_16T12_59_32.515550", "path": ["**/details_harness|arc:challenge|25_2023-08-16T12:59:32.515550.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-16T12:59:32.515550.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_18T08_04_05.021795", "path": ["**/details_harness|drop|3_2023-10-18T08-04-05.021795.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-18T08-04-05.021795.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_18T08_04_05.021795", "path": ["**/details_harness|gsm8k|5_2023-10-18T08-04-05.021795.parquet"]}, {"split": "2023_12_02T13_43_40.813288", "path": ["**/details_harness|gsm8k|5_2023-12-02T13-43-40.813288.parquet"]}, {"split": "2023_12_02T13_43_58.509097", "path": ["**/details_harness|gsm8k|5_2023-12-02T13-43-58.509097.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-02T13-43-58.509097.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_16T12_59_32.515550", "path": ["**/details_harness|hellaswag|10_2023-08-16T12:59:32.515550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-16T12:59:32.515550.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_16T12_59_32.515550", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-16T12:59:32.515550.parquet", 
"**/details_harness|hendrycksTest-anatomy|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-16T12:59:32.515550.parquet", 
"**/details_harness|hendrycksTest-marketing|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-16T12:59:32.515550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-16T12:59:32.515550.parquet", 
"**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-16T12:59:32.515550.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-16T12:59:32.515550.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_16T12_59_32.515550", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-16T12:59:32.515550.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-16T12:59:32.515550.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_16T12_59_32.515550", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-16T12:59:32.515550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-16T12:59:32.515550.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_16T12_59_32.515550", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-16T12:59:32.515550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-16T12:59:32.515550.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_16T12_59_32.515550", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-16T12:59:32.515550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-16T12:59:32.515550.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_16T12_59_32.515550", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-16T12:59:32.515550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-16T12:59:32.515550.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_16T12_59_32.515550", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-16T12:59:32.515550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-16T12:59:32.515550.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_16T12_59_32.515550", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-16T12:59:32.515550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-16T12:59:32.515550.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_16T12_59_32.515550", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-16T12:59:32.515550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-16T12:59:32.515550.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_16T12_59_32.515550", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-16T12:59:32.515550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-16T12:59:32.515550.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_16T12_59_32.515550", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-16T12:59:32.515550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-16T12:59:32.515550.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_16T12_59_32.515550", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-16T12:59:32.515550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-16T12:59:32.515550.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": 
[{"split": "2023_08_16T12_59_32.515550", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-16T12:59:32.515550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-16T12:59:32.515550.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_16T12_59_32.515550", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-16T12:59:32.515550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-16T12:59:32.515550.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_16T12_59_32.515550", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-16T12:59:32.515550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-16T12:59:32.515550.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_16T12_59_32.515550", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-16T12:59:32.515550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-16T12:59:32.515550.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_16T12_59_32.515550", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-16T12:59:32.515550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-16T12:59:32.515550.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_16T12_59_32.515550", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-16T12:59:32.515550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-16T12:59:32.515550.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_16T12_59_32.515550", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-16T12:59:32.515550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-16T12:59:32.515550.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_16T12_59_32.515550", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-16T12:59:32.515550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-16T12:59:32.515550.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_16T12_59_32.515550", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-16T12:59:32.515550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-16T12:59:32.515550.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_16T12_59_32.515550", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-16T12:59:32.515550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-16T12:59:32.515550.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_16T12_59_32.515550", "path": 
["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-16T12:59:32.515550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-16T12:59:32.515550.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_16T12_59_32.515550", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-16T12:59:32.515550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-16T12:59:32.515550.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_16T12_59_32.515550", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-16T12:59:32.515550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-16T12:59:32.515550.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_16T12_59_32.515550", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-16T12:59:32.515550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-16T12:59:32.515550.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_16T12_59_32.515550", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-16T12:59:32.515550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-16T12:59:32.515550.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_16T12_59_32.515550", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-16T12:59:32.515550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-16T12:59:32.515550.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_16T12_59_32.515550", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-16T12:59:32.515550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-16T12:59:32.515550.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_16T12_59_32.515550", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-16T12:59:32.515550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-16T12:59:32.515550.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_16T12_59_32.515550", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-16T12:59:32.515550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-16T12:59:32.515550.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_16T12_59_32.515550", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-16T12:59:32.515550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-16T12:59:32.515550.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_16T12_59_32.515550", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-16T12:59:32.515550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-16T12:59:32.515550.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_16T12_59_32.515550", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-16T12:59:32.515550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-16T12:59:32.515550.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_16T12_59_32.515550", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-16T12:59:32.515550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-16T12:59:32.515550.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_16T12_59_32.515550", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-16T12:59:32.515550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-16T12:59:32.515550.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_16T12_59_32.515550", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-16T12:59:32.515550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-16T12:59:32.515550.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_16T12_59_32.515550", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-16T12:59:32.515550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-16T12:59:32.515550.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_16T12_59_32.515550", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-16T12:59:32.515550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-16T12:59:32.515550.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_16T12_59_32.515550", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-16T12:59:32.515550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-16T12:59:32.515550.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_16T12_59_32.515550", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-16T12:59:32.515550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-16T12:59:32.515550.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_16T12_59_32.515550", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-16T12:59:32.515550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-16T12:59:32.515550.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_16T12_59_32.515550", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-16T12:59:32.515550.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-16T12:59:32.515550.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_16T12_59_32.515550", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-16T12:59:32.515550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-16T12:59:32.515550.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_16T12_59_32.515550", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-16T12:59:32.515550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-16T12:59:32.515550.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_16T12_59_32.515550", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-16T12:59:32.515550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-16T12:59:32.515550.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_16T12_59_32.515550", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-16T12:59:32.515550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-16T12:59:32.515550.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_16T12_59_32.515550", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-16T12:59:32.515550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-16T12:59:32.515550.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_16T12_59_32.515550", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-16T12:59:32.515550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-16T12:59:32.515550.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_16T12_59_32.515550", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-16T12:59:32.515550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-16T12:59:32.515550.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_16T12_59_32.515550", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-16T12:59:32.515550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-16T12:59:32.515550.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_16T12_59_32.515550", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-16T12:59:32.515550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-16T12:59:32.515550.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_16T12_59_32.515550", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-16T12:59:32.515550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-16T12:59:32.515550.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_16T12_59_32.515550", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-16T12:59:32.515550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-16T12:59:32.515550.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_16T12_59_32.515550", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-16T12:59:32.515550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-16T12:59:32.515550.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_16T12_59_32.515550", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-16T12:59:32.515550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-16T12:59:32.515550.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_16T12_59_32.515550", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-16T12:59:32.515550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-16T12:59:32.515550.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_16T12_59_32.515550", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-16T12:59:32.515550.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-16T12:59:32.515550.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_16T12_59_32.515550", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-16T12:59:32.515550.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-16T12:59:32.515550.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_18T08_04_05.021795", "path": ["**/details_harness|winogrande|5_2023-10-18T08-04-05.021795.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-18T08-04-05.021795.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_16T12_59_32.515550", "path": ["results_2023-08-16T12:59:32.515550.parquet"]}, {"split": "2023_10_18T08_04_05.021795", "path": ["results_2023-10-18T08-04-05.021795.parquet"]}, {"split": "2023_12_02T13_43_40.813288", "path": ["results_2023-12-02T13-43-40.813288.parquet"]}, {"split": "2023_12_02T13_43_58.509097", "path": ["results_2023-12-02T13-43-58.509097.parquet"]}, {"split": "latest", "path": ["results_2023-12-02T13-43-58.509097.parquet"]}]}]}
|
2023-12-02T13:44:05+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of TheTravellingEngineer/bloom-1b1-RLHF-v2
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model TheTravellingEngineer/bloom-1b1-RLHF-v2 on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
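For example (mirroring the snippet from the full card above; `harness_gsm8k_5` is one of this dataset's configs):

```python
from datasets import load_dataset

# The "train" split of a config always points to the most recent run.
data = load_dataset(
    "open-llm-leaderboard/details_TheTravellingEngineer__bloom-1b1-RLHF-v2",
    "harness_gsm8k_5",
    split="train",
)
```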
## Latest results
These are the latest results from run 2023-12-02T13:43:58.509097 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of TheTravellingEngineer/bloom-1b1-RLHF-v2",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheTravellingEngineer/bloom-1b1-RLHF-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-02T13:43:58.509097(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of TheTravellingEngineer/bloom-1b1-RLHF-v2",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheTravellingEngineer/bloom-1b1-RLHF-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-02T13:43:58.509097(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
26,
31,
175,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of TheTravellingEngineer/bloom-1b1-RLHF-v2## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheTravellingEngineer/bloom-1b1-RLHF-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-02T13:43:58.509097(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
4c452b7e01de663706f95fdb5b4f74e93bcd1136
|
# Dataset Card for Evaluation run of huggingface/llama-30b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/huggingface/llama-30b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [huggingface/llama-30b](https://huggingface.co/huggingface/llama-30b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_huggingface__llama-30b",
"harness_winogrande_5",
split="train")
```
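To inspect the aggregated metrics instead, you can load the "results" configuration (a minimal sketch; besides "latest", its splits are named after the run timestamps):

```python
from datasets import load_dataset

# The "results" config aggregates all runs; the "latest" split is the newest one.
results = load_dataset(
    "open-llm-leaderboard/details_huggingface__llama-30b",
    "results",
    split="latest",
)
```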
## Latest results
These are the [latest results from run 2023-10-17T02:42:34.429291](https://huggingface.co/datasets/open-llm-leaderboard/details_huggingface__llama-30b/blob/main/results_2023-10-17T02-42-34.429291.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0014681208053691276,
"em_stderr": 0.0003921042190298701,
"f1": 0.06332634228187943,
"f1_stderr": 0.0013742294190200051,
"acc": 0.47445656434133393,
"acc_stderr": 0.010516415781576863
},
"harness|drop|3": {
"em": 0.0014681208053691276,
"em_stderr": 0.0003921042190298701,
"f1": 0.06332634228187943,
"f1_stderr": 0.0013742294190200051
},
"harness|gsm8k|5": {
"acc": 0.14859742228961334,
"acc_stderr": 0.009797503180527876
},
"harness|winogrande|5": {
"acc": 0.8003157063930545,
"acc_stderr": 0.011235328382625849
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_huggingface__llama-30b
|
[
"region:us"
] |
2023-08-18T10:14:30+00:00
|
{"pretty_name": "Evaluation run of huggingface/llama-30b", "dataset_summary": "Dataset automatically created during the evaluation run of model [huggingface/llama-30b](https://huggingface.co/huggingface/llama-30b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_huggingface__llama-30b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-17T02:42:34.429291](https://huggingface.co/datasets/open-llm-leaderboard/details_huggingface__llama-30b/blob/main/results_2023-10-17T02-42-34.429291.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0014681208053691276,\n \"em_stderr\": 0.0003921042190298701,\n \"f1\": 0.06332634228187943,\n \"f1_stderr\": 0.0013742294190200051,\n \"acc\": 0.47445656434133393,\n \"acc_stderr\": 0.010516415781576863\n },\n \"harness|drop|3\": {\n \"em\": 0.0014681208053691276,\n \"em_stderr\": 0.0003921042190298701,\n \"f1\": 0.06332634228187943,\n \"f1_stderr\": 0.0013742294190200051\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.14859742228961334,\n \"acc_stderr\": 0.009797503180527876\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8003157063930545,\n \"acc_stderr\": 0.011235328382625849\n }\n}\n```", "repo_url": "https://huggingface.co/huggingface/llama-30b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T07_25_43.005787", "path": ["**/details_harness|arc:challenge|25_2023-07-19T07:25:43.005787.parquet"]}, {"split": "2023_07_19T13_27_01.742279", "path": ["**/details_harness|arc:challenge|25_2023-07-19T13:27:01.742279.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T13:27:01.742279.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_17T02_42_34.429291", "path": ["**/details_harness|drop|3_2023-10-17T02-42-34.429291.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-17T02-42-34.429291.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_17T02_42_34.429291", "path": ["**/details_harness|gsm8k|5_2023-10-17T02-42-34.429291.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-17T02-42-34.429291.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T07_25_43.005787", "path": ["**/details_harness|hellaswag|10_2023-07-19T07:25:43.005787.parquet"]}, {"split": "2023_07_19T13_27_01.742279", "path": 
["**/details_harness|hellaswag|10_2023-07-19T13:27:01.742279.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T13:27:01.742279.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T07_25_43.005787", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T07:25:43.005787.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T07:25:43.005787.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T07:25:43.005787.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T07:25:43.005787.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T07:25:43.005787.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T07:25:43.005787.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T07:25:43.005787.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T07:25:43.005787.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T07:25:43.005787.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T07:25:43.005787.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T07:25:43.005787.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T07:25:43.005787.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T07:25:43.005787.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T07:25:43.005787.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T07:25:43.005787.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T07:25:43.005787.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T07:25:43.005787.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T07:25:43.005787.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T07:25:43.005787.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T07:25:43.005787.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T07:25:43.005787.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T07:25:43.005787.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T07:25:43.005787.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T07:25:43.005787.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T07:25:43.005787.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T07:25:43.005787.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T07:25:43.005787.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T07:25:43.005787.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T07:25:43.005787.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T07:25:43.005787.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T07:25:43.005787.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T07:25:43.005787.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T07:25:43.005787.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T07:25:43.005787.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T07:25:43.005787.parquet", 
"**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T07:25:43.005787.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T07:25:43.005787.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T07:25:43.005787.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T07:25:43.005787.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T07:25:43.005787.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T07:25:43.005787.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T07:25:43.005787.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T07:25:43.005787.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T07:25:43.005787.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T07:25:43.005787.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T07:25:43.005787.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T07:25:43.005787.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T07:25:43.005787.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T07:25:43.005787.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T07:25:43.005787.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T07:25:43.005787.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T07:25:43.005787.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T07:25:43.005787.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T07:25:43.005787.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T07:25:43.005787.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T07:25:43.005787.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T07:25:43.005787.parquet"]}, {"split": "2023_07_19T13_27_01.742279", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:27:01.742279.parquet", 
"**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:27:01.742279.parquet", 
"**/details_harness|hendrycksTest-virology|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:27:01.742279.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:27:01.742279.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T13:27:01.742279.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:27:01.742279.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_19T07_25_43.005787", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T07:25:43.005787.parquet"]}, {"split": "2023_07_19T13_27_01.742279", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:27:01.742279.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:27:01.742279.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T07_25_43.005787", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T07:25:43.005787.parquet"]}, {"split": "2023_07_19T13_27_01.742279", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:27:01.742279.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:27:01.742279.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T07_25_43.005787", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T07:25:43.005787.parquet"]}, {"split": "2023_07_19T13_27_01.742279", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:27:01.742279.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:27:01.742279.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T07_25_43.005787", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T07:25:43.005787.parquet"]}, {"split": "2023_07_19T13_27_01.742279", "path": 
["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:27:01.742279.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:27:01.742279.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T07_25_43.005787", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T07:25:43.005787.parquet"]}, {"split": "2023_07_19T13_27_01.742279", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:27:01.742279.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:27:01.742279.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T07_25_43.005787", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T07:25:43.005787.parquet"]}, {"split": "2023_07_19T13_27_01.742279", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:27:01.742279.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:27:01.742279.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T07_25_43.005787", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T07:25:43.005787.parquet"]}, {"split": "2023_07_19T13_27_01.742279", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:27:01.742279.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:27:01.742279.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T07_25_43.005787", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T07:25:43.005787.parquet"]}, {"split": "2023_07_19T13_27_01.742279", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:27:01.742279.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:27:01.742279.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T07_25_43.005787", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T07:25:43.005787.parquet"]}, {"split": "2023_07_19T13_27_01.742279", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:27:01.742279.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:27:01.742279.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T07_25_43.005787", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T07:25:43.005787.parquet"]}, {"split": "2023_07_19T13_27_01.742279", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:27:01.742279.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:27:01.742279.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T07_25_43.005787", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T07:25:43.005787.parquet"]}, {"split": "2023_07_19T13_27_01.742279", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:27:01.742279.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:27:01.742279.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T07_25_43.005787", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T07:25:43.005787.parquet"]}, {"split": "2023_07_19T13_27_01.742279", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:27:01.742279.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:27:01.742279.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T07_25_43.005787", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T07:25:43.005787.parquet"]}, {"split": "2023_07_19T13_27_01.742279", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:27:01.742279.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:27:01.742279.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T07_25_43.005787", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T07:25:43.005787.parquet"]}, {"split": "2023_07_19T13_27_01.742279", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:27:01.742279.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:27:01.742279.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T07_25_43.005787", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T07:25:43.005787.parquet"]}, {"split": "2023_07_19T13_27_01.742279", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:27:01.742279.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:27:01.742279.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T07_25_43.005787", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T07:25:43.005787.parquet"]}, {"split": "2023_07_19T13_27_01.742279", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:27:01.742279.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:27:01.742279.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T07_25_43.005787", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T07:25:43.005787.parquet"]}, {"split": "2023_07_19T13_27_01.742279", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:27:01.742279.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:27:01.742279.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T07_25_43.005787", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T07:25:43.005787.parquet"]}, {"split": "2023_07_19T13_27_01.742279", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:27:01.742279.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:27:01.742279.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": 
"2023_07_19T07_25_43.005787", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T07:25:43.005787.parquet"]}, {"split": "2023_07_19T13_27_01.742279", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:27:01.742279.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:27:01.742279.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T07_25_43.005787", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T07:25:43.005787.parquet"]}, {"split": "2023_07_19T13_27_01.742279", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:27:01.742279.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:27:01.742279.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T07_25_43.005787", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T07:25:43.005787.parquet"]}, {"split": "2023_07_19T13_27_01.742279", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:27:01.742279.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:27:01.742279.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T07_25_43.005787", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T07:25:43.005787.parquet"]}, {"split": "2023_07_19T13_27_01.742279", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:27:01.742279.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:27:01.742279.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T07_25_43.005787", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T07:25:43.005787.parquet"]}, {"split": "2023_07_19T13_27_01.742279", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:27:01.742279.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:27:01.742279.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T07_25_43.005787", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T07:25:43.005787.parquet"]}, {"split": "2023_07_19T13_27_01.742279", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:27:01.742279.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:27:01.742279.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T07_25_43.005787", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T07:25:43.005787.parquet"]}, {"split": "2023_07_19T13_27_01.742279", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:27:01.742279.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:27:01.742279.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T07_25_43.005787", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T07:25:43.005787.parquet"]}, {"split": "2023_07_19T13_27_01.742279", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:27:01.742279.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:27:01.742279.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T07_25_43.005787", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T07:25:43.005787.parquet"]}, {"split": "2023_07_19T13_27_01.742279", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:27:01.742279.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:27:01.742279.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T07_25_43.005787", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T07:25:43.005787.parquet"]}, {"split": "2023_07_19T13_27_01.742279", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:27:01.742279.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:27:01.742279.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T07_25_43.005787", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T07:25:43.005787.parquet"]}, {"split": "2023_07_19T13_27_01.742279", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:27:01.742279.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:27:01.742279.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T07_25_43.005787", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T07:25:43.005787.parquet"]}, {"split": "2023_07_19T13_27_01.742279", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:27:01.742279.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:27:01.742279.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T07_25_43.005787", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T07:25:43.005787.parquet"]}, {"split": "2023_07_19T13_27_01.742279", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:27:01.742279.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:27:01.742279.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T07_25_43.005787", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T07:25:43.005787.parquet"]}, {"split": "2023_07_19T13_27_01.742279", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:27:01.742279.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:27:01.742279.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T07_25_43.005787", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T07:25:43.005787.parquet"]}, {"split": "2023_07_19T13_27_01.742279", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:27:01.742279.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:27:01.742279.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T07_25_43.005787", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T07:25:43.005787.parquet"]}, {"split": "2023_07_19T13_27_01.742279", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:27:01.742279.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:27:01.742279.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T07_25_43.005787", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T07:25:43.005787.parquet"]}, {"split": "2023_07_19T13_27_01.742279", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:27:01.742279.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:27:01.742279.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T07_25_43.005787", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T07:25:43.005787.parquet"]}, {"split": "2023_07_19T13_27_01.742279", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:27:01.742279.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:27:01.742279.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T07_25_43.005787", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T07:25:43.005787.parquet"]}, {"split": "2023_07_19T13_27_01.742279", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:27:01.742279.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:27:01.742279.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T07_25_43.005787", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T07:25:43.005787.parquet"]}, {"split": "2023_07_19T13_27_01.742279", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:27:01.742279.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:27:01.742279.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T07_25_43.005787", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T07:25:43.005787.parquet"]}, {"split": "2023_07_19T13_27_01.742279", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T13:27:01.742279.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T13:27:01.742279.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T07_25_43.005787", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T07:25:43.005787.parquet"]}, {"split": "2023_07_19T13_27_01.742279", "path": 
["**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:27:01.742279.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:27:01.742279.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T07_25_43.005787", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T07:25:43.005787.parquet"]}, {"split": "2023_07_19T13_27_01.742279", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:27:01.742279.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:27:01.742279.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T07_25_43.005787", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T07:25:43.005787.parquet"]}, {"split": "2023_07_19T13_27_01.742279", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:27:01.742279.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:27:01.742279.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T07_25_43.005787", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T07:25:43.005787.parquet"]}, {"split": "2023_07_19T13_27_01.742279", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:27:01.742279.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:27:01.742279.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T07_25_43.005787", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T07:25:43.005787.parquet"]}, {"split": "2023_07_19T13_27_01.742279", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:27:01.742279.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:27:01.742279.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T07_25_43.005787", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T07:25:43.005787.parquet"]}, {"split": "2023_07_19T13_27_01.742279", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:27:01.742279.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:27:01.742279.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T07_25_43.005787", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T07:25:43.005787.parquet"]}, {"split": "2023_07_19T13_27_01.742279", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:27:01.742279.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:27:01.742279.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T07_25_43.005787", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T07:25:43.005787.parquet"]}, {"split": "2023_07_19T13_27_01.742279", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:27:01.742279.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:27:01.742279.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": 
"2023_07_19T07_25_43.005787", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T07:25:43.005787.parquet"]}, {"split": "2023_07_19T13_27_01.742279", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:27:01.742279.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:27:01.742279.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T07_25_43.005787", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T07:25:43.005787.parquet"]}, {"split": "2023_07_19T13_27_01.742279", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:27:01.742279.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:27:01.742279.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T07_25_43.005787", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T07:25:43.005787.parquet"]}, {"split": "2023_07_19T13_27_01.742279", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:27:01.742279.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:27:01.742279.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T07_25_43.005787", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T07:25:43.005787.parquet"]}, {"split": "2023_07_19T13_27_01.742279", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:27:01.742279.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:27:01.742279.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_19T07_25_43.005787", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T07:25:43.005787.parquet"]}, {"split": "2023_07_19T13_27_01.742279", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:27:01.742279.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:27:01.742279.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T07_25_43.005787", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T07:25:43.005787.parquet"]}, {"split": "2023_07_19T13_27_01.742279", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:27:01.742279.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:27:01.742279.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T07_25_43.005787", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T07:25:43.005787.parquet"]}, {"split": "2023_07_19T13_27_01.742279", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:27:01.742279.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:27:01.742279.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T07_25_43.005787", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T07:25:43.005787.parquet"]}, {"split": 
"2023_07_19T13_27_01.742279", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:27:01.742279.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:27:01.742279.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T07_25_43.005787", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T07:25:43.005787.parquet"]}, {"split": "2023_07_19T13_27_01.742279", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T13:27:01.742279.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T13:27:01.742279.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T07_25_43.005787", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T07:25:43.005787.parquet"]}, {"split": "2023_07_19T13_27_01.742279", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:27:01.742279.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:27:01.742279.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T07_25_43.005787", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T07:25:43.005787.parquet"]}, {"split": "2023_07_19T13_27_01.742279", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T13:27:01.742279.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T13:27:01.742279.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_17T02_42_34.429291", "path": ["**/details_harness|winogrande|5_2023-10-17T02-42-34.429291.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-17T02-42-34.429291.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T07_25_43.005787", "path": ["results_2023-07-19T07:25:43.005787.parquet"]}, {"split": "2023_07_19T13_27_01.742279", "path": ["results_2023-07-19T13:27:01.742279.parquet"]}, {"split": "2023_10_17T02_42_34.429291", "path": ["results_2023-10-17T02-42-34.429291.parquet"]}, {"split": "latest", "path": ["results_2023-10-17T02-42-34.429291.parquet"]}]}]}
|
2023-10-17T01:42:47+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of huggingface/llama-30b
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model huggingface/llama-30b on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
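The snippet below is reconstructed from this card's metadata (the repository id, configuration name, and splits all appear verbatim in the `configs` list recorded above):
```python
from datasets import load_dataset

# "train" always points to the latest evaluation run for this configuration.
data = load_dataset("open-llm-leaderboard/details_huggingface__llama-30b",
	"harness_winogrande_5",
	split="train")
```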
## Latest results
These are the latest results from run 2023-10-17T02:42:34.429291 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
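The aggregated figures for that run, as recorded in this card's metadata:
```python
{
    "all": {
        "em": 0.0014681208053691276,
        "em_stderr": 0.0003921042190298701,
        "f1": 0.06332634228187943,
        "f1_stderr": 0.0013742294190200051,
        "acc": 0.47445656434133393,
        "acc_stderr": 0.010516415781576863
    },
    "harness|drop|3": {
        "em": 0.0014681208053691276,
        "em_stderr": 0.0003921042190298701,
        "f1": 0.06332634228187943,
        "f1_stderr": 0.0013742294190200051
    },
    "harness|gsm8k|5": {
        "acc": 0.14859742228961334,
        "acc_stderr": 0.009797503180527876
    },
    "harness|winogrande|5": {
        "acc": 0.8003157063930545,
        "acc_stderr": 0.011235328382625849
    }
}
```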
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of huggingface/llama-30b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model huggingface/llama-30b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-17T02:42:34.429291(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of huggingface/llama-30b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model huggingface/llama-30b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-17T02:42:34.429291(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
17,
31,
165,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of huggingface/llama-30b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model huggingface/llama-30b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-17T02:42:34.429291(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
20c102e6bd9a591496a5833ad3bb3e03f98a5690
|
# Dataset Card for Evaluation run of huggingface/llama-65b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/huggingface/llama-65b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [huggingface/llama-65b](https://huggingface.co/huggingface/llama-65b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_huggingface__llama-65b",
"harness_winogrande_5",
split="train")
```
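To pull the aggregated metrics rather than per-sample details, you can load the "results" configuration instead. A minimal sketch, assuming this repository follows the same config/split layout as the sibling leaderboard detail datasets (where "results" exposes a "latest" split):
```python
from datasets import load_dataset

# "latest" is assumed to point at the aggregated metrics of the most recent run.
results = load_dataset("open-llm-leaderboard/details_huggingface__llama-65b",
	"results",
	split="latest")
```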
## Latest results
These are the [latest results from run 2023-10-17T19:28:15.120039](https://huggingface.co/datasets/open-llm-leaderboard/details_huggingface__llama-65b/blob/main/results_2023-10-17T19-28-15.120039.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0014681208053691276,
"em_stderr": 0.00039210421902984954,
"f1": 0.05979341442953042,
"f1_stderr": 0.0013171267692059473,
"acc": 0.551148504673065,
"acc_stderr": 0.011494117650229113
},
"harness|drop|3": {
"em": 0.0014681208053691276,
"em_stderr": 0.00039210421902984954,
"f1": 0.05979341442953042,
"f1_stderr": 0.0013171267692059473
},
"harness|gsm8k|5": {
"acc": 0.2767247915087187,
"acc_stderr": 0.012323047397959794
},
"harness|winogrande|5": {
"acc": 0.8255722178374112,
"acc_stderr": 0.010665187902498433
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_huggingface__llama-65b
|
[
"region:us"
] |
2023-08-18T10:14:46+00:00
|
{"pretty_name": "Evaluation run of huggingface/llama-65b", "dataset_summary": "Dataset automatically created during the evaluation run of model [huggingface/llama-65b](https://huggingface.co/huggingface/llama-65b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_huggingface__llama-65b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-17T19:28:15.120039](https://huggingface.co/datasets/open-llm-leaderboard/details_huggingface__llama-65b/blob/main/results_2023-10-17T19-28-15.120039.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0014681208053691276,\n \"em_stderr\": 0.00039210421902984954,\n \"f1\": 0.05979341442953042,\n \"f1_stderr\": 0.0013171267692059473,\n \"acc\": 0.551148504673065,\n \"acc_stderr\": 0.011494117650229113\n },\n \"harness|drop|3\": {\n \"em\": 0.0014681208053691276,\n \"em_stderr\": 0.00039210421902984954,\n \"f1\": 0.05979341442953042,\n \"f1_stderr\": 0.0013171267692059473\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2767247915087187,\n \"acc_stderr\": 0.012323047397959794\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8255722178374112,\n \"acc_stderr\": 0.010665187902498433\n }\n}\n```", "repo_url": "https://huggingface.co/huggingface/llama-65b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_21T14_43_31.065297", "path": ["**/details_harness|arc:challenge|25_2023-07-21T14:43:31.065297.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-21T14:43:31.065297.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_17T19_28_15.120039", "path": ["**/details_harness|drop|3_2023-10-17T19-28-15.120039.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-17T19-28-15.120039.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_17T19_28_15.120039", "path": ["**/details_harness|gsm8k|5_2023-10-17T19-28-15.120039.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-17T19-28-15.120039.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_21T14_43_31.065297", "path": ["**/details_harness|hellaswag|10_2023-07-21T14:43:31.065297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-21T14:43:31.065297.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_07_21T14_43_31.065297", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-21T14:43:31.065297.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-21T14:43:31.065297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-21T14:43:31.065297.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-21T14:43:31.065297.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-21T14:43:31.065297.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_21T14_43_31.065297", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-21T14:43:31.065297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-21T14:43:31.065297.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_21T14_43_31.065297", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-21T14:43:31.065297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-21T14:43:31.065297.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_21T14_43_31.065297", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-21T14:43:31.065297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-21T14:43:31.065297.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_21T14_43_31.065297", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-21T14:43:31.065297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-21T14:43:31.065297.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_21T14_43_31.065297", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-21T14:43:31.065297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-21T14:43:31.065297.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_21T14_43_31.065297", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-21T14:43:31.065297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-21T14:43:31.065297.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_21T14_43_31.065297", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-21T14:43:31.065297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-21T14:43:31.065297.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_21T14_43_31.065297", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-21T14:43:31.065297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-21T14:43:31.065297.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_21T14_43_31.065297", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-21T14:43:31.065297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-21T14:43:31.065297.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_21T14_43_31.065297", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-21T14:43:31.065297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-21T14:43:31.065297.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_21T14_43_31.065297", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-21T14:43:31.065297.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-21T14:43:31.065297.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_21T14_43_31.065297", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-21T14:43:31.065297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-21T14:43:31.065297.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_21T14_43_31.065297", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-21T14:43:31.065297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-21T14:43:31.065297.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_21T14_43_31.065297", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-21T14:43:31.065297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-21T14:43:31.065297.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_21T14_43_31.065297", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-21T14:43:31.065297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-21T14:43:31.065297.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_21T14_43_31.065297", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-21T14:43:31.065297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-21T14:43:31.065297.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_21T14_43_31.065297", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-21T14:43:31.065297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-21T14:43:31.065297.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_21T14_43_31.065297", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-21T14:43:31.065297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-21T14:43:31.065297.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_21T14_43_31.065297", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-21T14:43:31.065297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-21T14:43:31.065297.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_21T14_43_31.065297", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-21T14:43:31.065297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-21T14:43:31.065297.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_21T14_43_31.065297", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-21T14:43:31.065297.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-21T14:43:31.065297.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_21T14_43_31.065297", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-21T14:43:31.065297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-21T14:43:31.065297.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_21T14_43_31.065297", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-21T14:43:31.065297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-21T14:43:31.065297.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_21T14_43_31.065297", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-21T14:43:31.065297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-21T14:43:31.065297.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_21T14_43_31.065297", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-21T14:43:31.065297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-21T14:43:31.065297.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_21T14_43_31.065297", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-21T14:43:31.065297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-21T14:43:31.065297.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_21T14_43_31.065297", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-21T14:43:31.065297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-21T14:43:31.065297.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_21T14_43_31.065297", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-21T14:43:31.065297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-21T14:43:31.065297.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_21T14_43_31.065297", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-21T14:43:31.065297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-21T14:43:31.065297.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_21T14_43_31.065297", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-21T14:43:31.065297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-21T14:43:31.065297.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_21T14_43_31.065297", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-21T14:43:31.065297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-21T14:43:31.065297.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_21T14_43_31.065297", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-21T14:43:31.065297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-21T14:43:31.065297.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_21T14_43_31.065297", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-21T14:43:31.065297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-21T14:43:31.065297.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_21T14_43_31.065297", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-21T14:43:31.065297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-21T14:43:31.065297.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_21T14_43_31.065297", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-21T14:43:31.065297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-21T14:43:31.065297.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_21T14_43_31.065297", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-21T14:43:31.065297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-21T14:43:31.065297.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_21T14_43_31.065297", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-21T14:43:31.065297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-21T14:43:31.065297.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_21T14_43_31.065297", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-21T14:43:31.065297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-21T14:43:31.065297.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_21T14_43_31.065297", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-21T14:43:31.065297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-21T14:43:31.065297.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_21T14_43_31.065297", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-21T14:43:31.065297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-21T14:43:31.065297.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_21T14_43_31.065297", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-21T14:43:31.065297.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-21T14:43:31.065297.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_21T14_43_31.065297", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-21T14:43:31.065297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-21T14:43:31.065297.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_21T14_43_31.065297", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-21T14:43:31.065297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-21T14:43:31.065297.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_21T14_43_31.065297", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-21T14:43:31.065297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-21T14:43:31.065297.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_21T14_43_31.065297", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-21T14:43:31.065297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-21T14:43:31.065297.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_21T14_43_31.065297", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-21T14:43:31.065297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-21T14:43:31.065297.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_21T14_43_31.065297", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-21T14:43:31.065297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-21T14:43:31.065297.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_21T14_43_31.065297", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-21T14:43:31.065297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-21T14:43:31.065297.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_21T14_43_31.065297", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-21T14:43:31.065297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-21T14:43:31.065297.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_21T14_43_31.065297", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-21T14:43:31.065297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-21T14:43:31.065297.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_21T14_43_31.065297", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-21T14:43:31.065297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-21T14:43:31.065297.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_21T14_43_31.065297", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-21T14:43:31.065297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-21T14:43:31.065297.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_21T14_43_31.065297", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-21T14:43:31.065297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-21T14:43:31.065297.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_21T14_43_31.065297", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-21T14:43:31.065297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-21T14:43:31.065297.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_21T14_43_31.065297", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-21T14:43:31.065297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-21T14:43:31.065297.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_21T14_43_31.065297", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-21T14:43:31.065297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-21T14:43:31.065297.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_21T14_43_31.065297", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-21T14:43:31.065297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-21T14:43:31.065297.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_21T14_43_31.065297", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-21T14:43:31.065297.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-21T14:43:31.065297.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_17T19_28_15.120039", "path": ["**/details_harness|winogrande|5_2023-10-17T19-28-15.120039.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-17T19-28-15.120039.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_21T14_43_31.065297", "path": ["results_2023-07-21T14:43:31.065297.parquet"]}, {"split": "2023_10_17T19_28_15.120039", "path": ["results_2023-10-17T19-28-15.120039.parquet"]}, {"split": "latest", "path": ["results_2023-10-17T19-28-15.120039.parquet"]}]}]}
|
2023-10-17T18:28:27+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of huggingface/llama-65b
## Dataset Description
- Homepage:
- Repository: https://huggingface.co/huggingface/llama-65b
- Paper:
- Leaderboard: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- Point of Contact: [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model huggingface/llama-65b on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
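For example, the following snippet loads the winogrande details of the latest run (the dataset path and configuration name are taken from this card's metadata):

```python
from datasets import load_dataset

# "harness_winogrande_5" is one of the 64 task configurations;
# split="train" points to the latest run, as described above.
data = load_dataset("open-llm-leaderboard/details_huggingface__llama-65b",
                    "harness_winogrande_5",
                    split="train")
```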
## Latest results
These are the latest results from run 2023-10-17T19:28:15.120039 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
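The aggregated metrics recorded for that run in this card's metadata are:

```python
{
    "all": {
        "em": 0.0014681208053691276,
        "em_stderr": 0.00039210421902984954,
        "f1": 0.05979341442953042,
        "f1_stderr": 0.0013171267692059473,
        "acc": 0.551148504673065,
        "acc_stderr": 0.011494117650229113
    },
    "harness|drop|3": {
        "em": 0.0014681208053691276,
        "em_stderr": 0.00039210421902984954,
        "f1": 0.05979341442953042,
        "f1_stderr": 0.0013171267692059473
    },
    "harness|gsm8k|5": {
        "acc": 0.2767247915087187,
        "acc_stderr": 0.012323047397959794
    },
    "harness|winogrande|5": {
        "acc": 0.8255722178374112,
        "acc_stderr": 0.010665187902498433
    }
}
```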
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of huggingface/llama-65b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model huggingface/llama-65b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-17T19:28:15.120039(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of huggingface/llama-65b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model huggingface/llama-65b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-17T19:28:15.120039(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
18,
31,
166,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of huggingface/llama-65b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model huggingface/llama-65b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-17T19:28:15.120039(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
29b77583cbaac12b96436a6b96c6dc033fa27a65
|
# Dataset Card for Evaluation run of huggingface/llama-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/huggingface/llama-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [huggingface/llama-7b](https://huggingface.co/huggingface/llama-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_huggingface__llama-7b",
"harness_winogrande_5",
split="train")
```
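Each past run is also exposed as its own split, named after the run timestamp. For instance, to pin the winogrande details to the 2023-10-17 run instead of following the latest results (the split name below is taken verbatim from this card's metadata):

```python
from datasets import load_dataset

# Pin to a specific run by using its timestamped split instead of "train".
data = load_dataset("open-llm-leaderboard/details_huggingface__llama-7b",
                    "harness_winogrande_5",
                    split="2023_10_17T04_34_28.358522")
```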
## Latest results
These are the [latest results from run 2023-10-17T04:34:28.358522](https://huggingface.co/datasets/open-llm-leaderboard/details_huggingface__llama-7b/blob/main/results_2023-10-17T04-34-28.358522.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0010486577181208054,
"em_stderr": 0.0003314581465219126,
"f1": 0.056186031879194784,
"f1_stderr": 0.0012858243614759428,
"acc": 0.3749593848153363,
"acc_stderr": 0.008901319861891403
},
"harness|drop|3": {
"em": 0.0010486577181208054,
"em_stderr": 0.0003314581465219126,
"f1": 0.056186031879194784,
"f1_stderr": 0.0012858243614759428
},
"harness|gsm8k|5": {
"acc": 0.0356330553449583,
"acc_stderr": 0.00510610785374419
},
"harness|winogrande|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.012696531870038616
}
}
```
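The aggregated numbers above can also be loaded directly through the "results" configuration rather than recomputed from the per-sample details; a minimal sketch, using the config and split names described in this card:

```python
from datasets import load_dataset

# The "results" config stores the aggregated metrics per run;
# split="latest" resolves to the most recent run.
results = load_dataset("open-llm-leaderboard/details_huggingface__llama-7b",
                       "results",
                       split="latest")
```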
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_huggingface__llama-7b
|
[
"region:us"
] |
2023-08-18T10:14:54+00:00
|
{"pretty_name": "Evaluation run of huggingface/llama-7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [huggingface/llama-7b](https://huggingface.co/huggingface/llama-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_huggingface__llama-7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-17T04:34:28.358522](https://huggingface.co/datasets/open-llm-leaderboard/details_huggingface__llama-7b/blob/main/results_2023-10-17T04-34-28.358522.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0010486577181208054,\n \"em_stderr\": 0.0003314581465219126,\n \"f1\": 0.056186031879194784,\n \"f1_stderr\": 0.0012858243614759428,\n \"acc\": 0.3749593848153363,\n \"acc_stderr\": 0.008901319861891403\n },\n \"harness|drop|3\": {\n \"em\": 0.0010486577181208054,\n \"em_stderr\": 0.0003314581465219126,\n \"f1\": 0.056186031879194784,\n \"f1_stderr\": 0.0012858243614759428\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0356330553449583,\n \"acc_stderr\": 0.00510610785374419\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.012696531870038616\n }\n}\n```", "repo_url": "https://huggingface.co/huggingface/llama-7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T11_42_45.726493", "path": ["**/details_harness|arc:challenge|25_2023-07-19T11:42:45.726493.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T11:42:45.726493.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_17T04_34_28.358522", "path": ["**/details_harness|drop|3_2023-10-17T04-34-28.358522.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-17T04-34-28.358522.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_17T04_34_28.358522", "path": ["**/details_harness|gsm8k|5_2023-10-17T04-34-28.358522.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-17T04-34-28.358522.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T11_42_45.726493", "path": ["**/details_harness|hellaswag|10_2023-07-19T11:42:45.726493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T11:42:45.726493.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_07_19T11_42_45.726493", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T11:42:45.726493.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T11:42:45.726493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T11:42:45.726493.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T11:42:45.726493.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T11:42:45.726493.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T11_42_45.726493", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T11:42:45.726493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T11:42:45.726493.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T11_42_45.726493", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T11:42:45.726493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T11:42:45.726493.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T11_42_45.726493", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T11:42:45.726493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T11:42:45.726493.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T11_42_45.726493", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T11:42:45.726493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T11:42:45.726493.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T11_42_45.726493", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T11:42:45.726493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T11:42:45.726493.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T11_42_45.726493", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T11:42:45.726493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T11:42:45.726493.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T11_42_45.726493", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T11:42:45.726493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T11:42:45.726493.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T11_42_45.726493", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T11:42:45.726493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T11:42:45.726493.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T11_42_45.726493", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T11:42:45.726493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T11:42:45.726493.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T11_42_45.726493", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T11:42:45.726493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T11:42:45.726493.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T11_42_45.726493", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T11:42:45.726493.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T11:42:45.726493.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T11_42_45.726493", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T11:42:45.726493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T11:42:45.726493.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T11_42_45.726493", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T11:42:45.726493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T11:42:45.726493.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T11_42_45.726493", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T11:42:45.726493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T11:42:45.726493.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T11_42_45.726493", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T11:42:45.726493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T11:42:45.726493.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T11_42_45.726493", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T11:42:45.726493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T11:42:45.726493.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T11_42_45.726493", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T11:42:45.726493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T11:42:45.726493.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T11_42_45.726493", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T11:42:45.726493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T11:42:45.726493.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T11_42_45.726493", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T11:42:45.726493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T11:42:45.726493.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T11_42_45.726493", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T11:42:45.726493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T11:42:45.726493.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T11_42_45.726493", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T11:42:45.726493.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T11:42:45.726493.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T11_42_45.726493", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T11:42:45.726493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T11:42:45.726493.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T11_42_45.726493", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T11:42:45.726493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T11:42:45.726493.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T11_42_45.726493", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T11:42:45.726493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T11:42:45.726493.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T11_42_45.726493", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T11:42:45.726493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T11:42:45.726493.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T11_42_45.726493", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T11:42:45.726493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T11:42:45.726493.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T11_42_45.726493", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T11:42:45.726493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T11:42:45.726493.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T11_42_45.726493", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T11:42:45.726493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T11:42:45.726493.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T11_42_45.726493", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T11:42:45.726493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T11:42:45.726493.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T11_42_45.726493", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T11:42:45.726493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T11:42:45.726493.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T11_42_45.726493", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T11:42:45.726493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T11:42:45.726493.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T11_42_45.726493", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T11:42:45.726493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T11:42:45.726493.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T11_42_45.726493", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T11:42:45.726493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T11:42:45.726493.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T11_42_45.726493", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T11:42:45.726493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T11:42:45.726493.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T11_42_45.726493", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T11:42:45.726493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T11:42:45.726493.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T11_42_45.726493", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T11:42:45.726493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T11:42:45.726493.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T11_42_45.726493", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T11:42:45.726493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T11:42:45.726493.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T11_42_45.726493", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T11:42:45.726493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T11:42:45.726493.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T11_42_45.726493", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T11:42:45.726493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T11:42:45.726493.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T11_42_45.726493", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T11:42:45.726493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T11:42:45.726493.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T11_42_45.726493", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T11:42:45.726493.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T11:42:45.726493.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T11_42_45.726493", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T11:42:45.726493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T11:42:45.726493.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T11_42_45.726493", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T11:42:45.726493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T11:42:45.726493.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T11_42_45.726493", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T11:42:45.726493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T11:42:45.726493.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T11_42_45.726493", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T11:42:45.726493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T11:42:45.726493.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T11_42_45.726493", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T11:42:45.726493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T11:42:45.726493.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T11_42_45.726493", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T11:42:45.726493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T11:42:45.726493.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T11_42_45.726493", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T11:42:45.726493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T11:42:45.726493.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T11_42_45.726493", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T11:42:45.726493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T11:42:45.726493.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T11_42_45.726493", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T11:42:45.726493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T11:42:45.726493.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T11_42_45.726493", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T11:42:45.726493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T11:42:45.726493.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T11_42_45.726493", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T11:42:45.726493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T11:42:45.726493.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T11_42_45.726493", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T11:42:45.726493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T11:42:45.726493.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T11_42_45.726493", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T11:42:45.726493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T11:42:45.726493.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T11_42_45.726493", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T11:42:45.726493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T11:42:45.726493.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T11_42_45.726493", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T11:42:45.726493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T11:42:45.726493.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T11_42_45.726493", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T11:42:45.726493.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T11:42:45.726493.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T11_42_45.726493", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T11:42:45.726493.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T11:42:45.726493.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_17T04_34_28.358522", "path": ["**/details_harness|winogrande|5_2023-10-17T04-34-28.358522.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-17T04-34-28.358522.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T11_42_45.726493", "path": ["results_2023-07-19T11:42:45.726493.parquet"]}, {"split": "2023_10_17T04_34_28.358522", "path": ["results_2023-10-17T04-34-28.358522.parquet"]}, {"split": "latest", "path": ["results_2023-10-17T04-34-28.358522.parquet"]}]}]}
|
2023-10-17T03:34:40+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of huggingface/llama-7b
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model huggingface/llama-7b on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
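A minimal sketch, assuming the repo id `open-llm-leaderboard/details_huggingface__llama-7b` (inferred from the model name, following the naming pattern of the other evaluation datasets in this document):

```python
from datasets import load_dataset

# Repo id inferred from the model name huggingface/llama-7b (assumption)
data = load_dataset("open-llm-leaderboard/details_huggingface__llama-7b",
	"harness_winogrande_5",
	split="train")
```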
## Latest results
These are the latest results from run 2023-10-17T04:34:28.358522 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
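The metric values themselves are not inlined above. Assuming the same repo id as in the sketch above, the aggregated numbers can be loaded from the `results` configuration, whose splits are listed in this card's metadata:

```python
from datasets import load_dataset

# "results" stores the aggregated metrics; the "latest" split points to this run
results = load_dataset("open-llm-leaderboard/details_huggingface__llama-7b",
	"results",
	split="latest")
```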
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
05f0ebadc02a3bdb54878d0e6549a9a2e6ce9410
|
# Dataset Card for Evaluation run of huggingface/llama-13b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/huggingface/llama-13b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [huggingface/llama-13b](https://huggingface.co/huggingface/llama-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_huggingface__llama-13b",
"harness_winogrande_5",
split="train")
```
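Each configuration also exposes one split per timestamped run. As a sketch, the details of a specific run can be loaded by naming its split (the split name below is taken from this card's metadata):

```python
from datasets import load_dataset

# Load the winogrande details for the 2023-09-23 run specifically
data = load_dataset("open-llm-leaderboard/details_huggingface__llama-13b",
	"harness_winogrande_5",
	split="2023_09_23T00_11_57.766802")
```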
## Latest results
These are the [latest results from run 2023-09-23T00:11:57.766802](https://huggingface.co/datasets/open-llm-leaderboard/details_huggingface__llama-13b/blob/main/results_2023-09-23T00-11-57.766802.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0019924496644295304,
"em_stderr": 0.000456667646266702,
"f1": 0.056602348993288636,
"f1_stderr": 0.0013004668300984712,
"acc": 0.4191229752993855,
"acc_stderr": 0.009626252314482865
},
"harness|drop|3": {
"em": 0.0019924496644295304,
"em_stderr": 0.000456667646266702,
"f1": 0.056602348993288636,
"f1_stderr": 0.0013004668300984712
},
"harness|gsm8k|5": {
"acc": 0.0758150113722517,
"acc_stderr": 0.007291205723162579
},
"harness|winogrande|5": {
"acc": 0.7624309392265194,
"acc_stderr": 0.011961298905803152
}
}
```
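The same numbers can be retrieved programmatically from the `results` configuration; a minimal sketch (config and split names come from this card's metadata):

```python
from datasets import load_dataset

# "latest" mirrors the most recent results file for this repo
results = load_dataset("open-llm-leaderboard/details_huggingface__llama-13b",
	"results",
	split="latest")
df = results.to_pandas()  # convert to a DataFrame for inspection
```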
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_huggingface__llama-13b
|
[
"region:us"
] |
2023-08-18T10:15:04+00:00
|
{"pretty_name": "Evaluation run of huggingface/llama-13b", "dataset_summary": "Dataset automatically created during the evaluation run of model [huggingface/llama-13b](https://huggingface.co/huggingface/llama-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_huggingface__llama-13b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-23T00:11:57.766802](https://huggingface.co/datasets/open-llm-leaderboard/details_huggingface__llama-13b/blob/main/results_2023-09-23T00-11-57.766802.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0019924496644295304,\n \"em_stderr\": 0.000456667646266702,\n \"f1\": 0.056602348993288636,\n \"f1_stderr\": 0.0013004668300984712,\n \"acc\": 0.4191229752993855,\n \"acc_stderr\": 0.009626252314482865\n },\n \"harness|drop|3\": {\n \"em\": 0.0019924496644295304,\n \"em_stderr\": 0.000456667646266702,\n \"f1\": 0.056602348993288636,\n \"f1_stderr\": 0.0013004668300984712\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0758150113722517,\n \"acc_stderr\": 0.007291205723162579\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7624309392265194,\n \"acc_stderr\": 0.011961298905803152\n }\n}\n```", "repo_url": "https://huggingface.co/huggingface/llama-13b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_20T10_00_13.254246", "path": ["**/details_harness|arc:challenge|25_2023-07-20T10:00:13.254246.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-20T10:00:13.254246.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_23T00_11_57.766802", "path": ["**/details_harness|drop|3_2023-09-23T00-11-57.766802.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-23T00-11-57.766802.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_23T00_11_57.766802", "path": ["**/details_harness|gsm8k|5_2023-09-23T00-11-57.766802.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-23T00-11-57.766802.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_20T10_00_13.254246", "path": ["**/details_harness|hellaswag|10_2023-07-20T10:00:13.254246.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-20T10:00:13.254246.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_07_20T10_00_13.254246", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-20T10:00:13.254246.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-20T10:00:13.254246.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-20T10:00:13.254246.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-20T10:00:13.254246.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-20T10:00:13.254246.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_20T10_00_13.254246", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-20T10:00:13.254246.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-20T10:00:13.254246.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_20T10_00_13.254246", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-20T10:00:13.254246.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-20T10:00:13.254246.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_20T10_00_13.254246", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-20T10:00:13.254246.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-20T10:00:13.254246.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_20T10_00_13.254246", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-20T10:00:13.254246.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-20T10:00:13.254246.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_20T10_00_13.254246", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-20T10:00:13.254246.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-20T10:00:13.254246.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_20T10_00_13.254246", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-20T10:00:13.254246.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-20T10:00:13.254246.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_20T10_00_13.254246", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-20T10:00:13.254246.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-20T10:00:13.254246.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_20T10_00_13.254246", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-20T10:00:13.254246.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-20T10:00:13.254246.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_20T10_00_13.254246", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-20T10:00:13.254246.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-20T10:00:13.254246.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_20T10_00_13.254246", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-20T10:00:13.254246.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-20T10:00:13.254246.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_20T10_00_13.254246", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-20T10:00:13.254246.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-20T10:00:13.254246.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_20T10_00_13.254246", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-20T10:00:13.254246.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-20T10:00:13.254246.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_20T10_00_13.254246", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-20T10:00:13.254246.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-20T10:00:13.254246.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_20T10_00_13.254246", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-20T10:00:13.254246.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-20T10:00:13.254246.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_20T10_00_13.254246", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-20T10:00:13.254246.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-20T10:00:13.254246.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_20T10_00_13.254246", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-20T10:00:13.254246.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-20T10:00:13.254246.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_20T10_00_13.254246", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-20T10:00:13.254246.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-20T10:00:13.254246.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_20T10_00_13.254246", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-20T10:00:13.254246.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-20T10:00:13.254246.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_20T10_00_13.254246", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-20T10:00:13.254246.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-20T10:00:13.254246.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_20T10_00_13.254246", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-20T10:00:13.254246.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-20T10:00:13.254246.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_20T10_00_13.254246", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-20T10:00:13.254246.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-20T10:00:13.254246.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_20T10_00_13.254246", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-20T10:00:13.254246.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-20T10:00:13.254246.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_20T10_00_13.254246", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-20T10:00:13.254246.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-20T10:00:13.254246.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_20T10_00_13.254246", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-20T10:00:13.254246.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-20T10:00:13.254246.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_20T10_00_13.254246", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-20T10:00:13.254246.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-20T10:00:13.254246.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_20T10_00_13.254246", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-20T10:00:13.254246.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-20T10:00:13.254246.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_20T10_00_13.254246", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-20T10:00:13.254246.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-20T10:00:13.254246.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_20T10_00_13.254246", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-20T10:00:13.254246.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-20T10:00:13.254246.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_20T10_00_13.254246", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-20T10:00:13.254246.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-20T10:00:13.254246.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_20T10_00_13.254246", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-20T10:00:13.254246.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-20T10:00:13.254246.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_20T10_00_13.254246", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-20T10:00:13.254246.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-20T10:00:13.254246.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_20T10_00_13.254246", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-20T10:00:13.254246.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-20T10:00:13.254246.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_20T10_00_13.254246", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-20T10:00:13.254246.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-20T10:00:13.254246.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_20T10_00_13.254246", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-20T10:00:13.254246.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-20T10:00:13.254246.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_20T10_00_13.254246", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-20T10:00:13.254246.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-20T10:00:13.254246.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_20T10_00_13.254246", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-20T10:00:13.254246.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-20T10:00:13.254246.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_20T10_00_13.254246", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-20T10:00:13.254246.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-20T10:00:13.254246.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_20T10_00_13.254246", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-20T10:00:13.254246.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-20T10:00:13.254246.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_20T10_00_13.254246", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-20T10:00:13.254246.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-20T10:00:13.254246.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_20T10_00_13.254246", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-20T10:00:13.254246.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-20T10:00:13.254246.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_20T10_00_13.254246", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-20T10:00:13.254246.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-20T10:00:13.254246.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_20T10_00_13.254246", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-20T10:00:13.254246.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-20T10:00:13.254246.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_20T10_00_13.254246", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-20T10:00:13.254246.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-20T10:00:13.254246.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_20T10_00_13.254246", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-20T10:00:13.254246.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-20T10:00:13.254246.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_20T10_00_13.254246", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-20T10:00:13.254246.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-20T10:00:13.254246.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_20T10_00_13.254246", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-20T10:00:13.254246.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-20T10:00:13.254246.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_20T10_00_13.254246", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-20T10:00:13.254246.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-20T10:00:13.254246.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_20T10_00_13.254246", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-20T10:00:13.254246.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-20T10:00:13.254246.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_20T10_00_13.254246", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-20T10:00:13.254246.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-20T10:00:13.254246.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_20T10_00_13.254246", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-20T10:00:13.254246.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-20T10:00:13.254246.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_20T10_00_13.254246", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-20T10:00:13.254246.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-20T10:00:13.254246.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_20T10_00_13.254246", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-20T10:00:13.254246.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-20T10:00:13.254246.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_20T10_00_13.254246", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-20T10:00:13.254246.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-20T10:00:13.254246.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_20T10_00_13.254246", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-20T10:00:13.254246.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-20T10:00:13.254246.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_20T10_00_13.254246", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-20T10:00:13.254246.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-20T10:00:13.254246.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_20T10_00_13.254246", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-20T10:00:13.254246.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-20T10:00:13.254246.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_20T10_00_13.254246", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-20T10:00:13.254246.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-20T10:00:13.254246.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_20T10_00_13.254246", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-20T10:00:13.254246.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-20T10:00:13.254246.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_23T00_11_57.766802", "path": ["**/details_harness|winogrande|5_2023-09-23T00-11-57.766802.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-23T00-11-57.766802.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_20T10_00_13.254246", "path": ["results_2023-07-20T10:00:13.254246.parquet"]}, {"split": "2023_09_23T00_11_57.766802", "path": ["results_2023-09-23T00-11-57.766802.parquet"]}, {"split": "latest", "path": ["results_2023-09-23T00-11-57.766802.parquet"]}]}]}
|
2023-09-22T23:12:09+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of huggingface/llama-13b
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model huggingface/llama-13b on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
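The code snippet that normally follows this sentence was stripped from this copy of the card. A minimal sketch, assuming the details dataset follows the `open-llm-leaderboard/details_<org>__<model>` naming pattern used by the other cards in this dump (so the dataset id below is inferred, not confirmed):
```python
from datasets import load_dataset

# "open-llm-leaderboard/details_huggingface__llama-13b" is an assumed id,
# derived from the naming convention of the surrounding cards.
data = load_dataset("open-llm-leaderboard/details_huggingface__llama-13b",
	"harness_winogrande_5",
	split="train")
```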
## Latest results
These are the latest results from run 2023-09-23T00:11:57.766802 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of huggingface/llama-13b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model huggingface/llama-13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-23T00:11:57.766802(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of huggingface/llama-13b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model huggingface/llama-13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-23T00:11:57.766802(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
17,
31,
165,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of huggingface/llama-13b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model huggingface/llama-13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-23T00:11:57.766802(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
072e7e3c7698accda2a0866da866de8b7f0ba300
|
# Dataset Card for Evaluation run of openchat/openchat_v2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/openchat/openchat_v2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [openchat/openchat_v2](https://huggingface.co/openchat/openchat_v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_openchat__openchat_v2",
"harness_winogrande_5",
split="train")
```
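The aggregated "results" configuration declared in this card's metadata can be loaded the same way; a small sketch using the "latest" split, which points at the most recent run:
```python
from datasets import load_dataset

# Aggregated metrics for the most recent evaluation run of openchat/openchat_v2
results = load_dataset("open-llm-leaderboard/details_openchat__openchat_v2",
	"results",
	split="latest")
```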
## Latest results
These are the [latest results from run 2023-10-18T23:33:59.473281](https://huggingface.co/datasets/open-llm-leaderboard/details_openchat__openchat_v2/blob/main/results_2023-10-18T23-33-59.473281.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0016778523489932886,
"em_stderr": 0.0004191330178826953,
"f1": 0.06369546979865812,
"f1_stderr": 0.0013881754743750058,
"acc": 0.4267044764366107,
"acc_stderr": 0.009941310874908384
},
"harness|drop|3": {
"em": 0.0016778523489932886,
"em_stderr": 0.0004191330178826953,
"f1": 0.06369546979865812,
"f1_stderr": 0.0013881754743750058
},
"harness|gsm8k|5": {
"acc": 0.09097801364670205,
"acc_stderr": 0.007921322844013628
},
"harness|winogrande|5": {
"acc": 0.7624309392265194,
"acc_stderr": 0.011961298905803141
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_openchat__openchat_v2
|
[
"region:us"
] |
2023-08-18T10:15:13+00:00
|
{"pretty_name": "Evaluation run of openchat/openchat_v2", "dataset_summary": "Dataset automatically created during the evaluation run of model [openchat/openchat_v2](https://huggingface.co/openchat/openchat_v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_openchat__openchat_v2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-18T23:33:59.473281](https://huggingface.co/datasets/open-llm-leaderboard/details_openchat__openchat_v2/blob/main/results_2023-10-18T23-33-59.473281.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0016778523489932886,\n \"em_stderr\": 0.0004191330178826953,\n \"f1\": 0.06369546979865812,\n \"f1_stderr\": 0.0013881754743750058,\n \"acc\": 0.4267044764366107,\n \"acc_stderr\": 0.009941310874908384\n },\n \"harness|drop|3\": {\n \"em\": 0.0016778523489932886,\n \"em_stderr\": 0.0004191330178826953,\n \"f1\": 0.06369546979865812,\n \"f1_stderr\": 0.0013881754743750058\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.09097801364670205,\n \"acc_stderr\": 0.007921322844013628\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7624309392265194,\n \"acc_stderr\": 0.011961298905803141\n }\n}\n```", "repo_url": "https://huggingface.co/openchat/openchat_v2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_24T16_15_43.375202", "path": ["**/details_harness|arc:challenge|25_2023-07-24T16:15:43.375202.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-24T16:15:43.375202.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_18T23_33_59.473281", "path": ["**/details_harness|drop|3_2023-10-18T23-33-59.473281.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-18T23-33-59.473281.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_18T23_33_59.473281", "path": ["**/details_harness|gsm8k|5_2023-10-18T23-33-59.473281.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-18T23-33-59.473281.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_24T16_15_43.375202", "path": ["**/details_harness|hellaswag|10_2023-07-24T16:15:43.375202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-24T16:15:43.375202.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_07_24T16_15_43.375202", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T16:15:43.375202.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-24T16:15:43.375202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T16:15:43.375202.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T16:15:43.375202.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-24T16:15:43.375202.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_24T16_15_43.375202", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T16:15:43.375202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T16:15:43.375202.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_24T16_15_43.375202", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T16:15:43.375202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T16:15:43.375202.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_24T16_15_43.375202", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T16:15:43.375202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T16:15:43.375202.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_24T16_15_43.375202", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T16:15:43.375202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T16:15:43.375202.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_24T16_15_43.375202", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T16:15:43.375202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T16:15:43.375202.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_24T16_15_43.375202", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T16:15:43.375202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T16:15:43.375202.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_24T16_15_43.375202", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T16:15:43.375202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T16:15:43.375202.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_24T16_15_43.375202", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T16:15:43.375202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T16:15:43.375202.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_24T16_15_43.375202", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T16:15:43.375202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T16:15:43.375202.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_24T16_15_43.375202", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T16:15:43.375202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T16:15:43.375202.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_24T16_15_43.375202", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T16:15:43.375202.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T16:15:43.375202.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_24T16_15_43.375202", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T16:15:43.375202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T16:15:43.375202.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_24T16_15_43.375202", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T16:15:43.375202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T16:15:43.375202.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_24T16_15_43.375202", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T16:15:43.375202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T16:15:43.375202.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_24T16_15_43.375202", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T16:15:43.375202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T16:15:43.375202.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_24T16_15_43.375202", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T16:15:43.375202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T16:15:43.375202.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_24T16_15_43.375202", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T16:15:43.375202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T16:15:43.375202.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_24T16_15_43.375202", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T16:15:43.375202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T16:15:43.375202.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_24T16_15_43.375202", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T16:15:43.375202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T16:15:43.375202.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_24T16_15_43.375202", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T16:15:43.375202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T16:15:43.375202.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_24T16_15_43.375202", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T16:15:43.375202.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T16:15:43.375202.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_24T16_15_43.375202", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T16:15:43.375202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T16:15:43.375202.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_24T16_15_43.375202", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T16:15:43.375202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T16:15:43.375202.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_24T16_15_43.375202", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T16:15:43.375202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T16:15:43.375202.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_24T16_15_43.375202", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T16:15:43.375202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T16:15:43.375202.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_24T16_15_43.375202", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T16:15:43.375202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T16:15:43.375202.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_24T16_15_43.375202", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T16:15:43.375202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T16:15:43.375202.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_24T16_15_43.375202", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T16:15:43.375202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T16:15:43.375202.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_24T16_15_43.375202", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T16:15:43.375202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T16:15:43.375202.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_24T16_15_43.375202", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T16:15:43.375202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T16:15:43.375202.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_24T16_15_43.375202", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T16:15:43.375202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T16:15:43.375202.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_24T16_15_43.375202", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T16:15:43.375202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T16:15:43.375202.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_24T16_15_43.375202", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T16:15:43.375202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T16:15:43.375202.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_24T16_15_43.375202", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T16:15:43.375202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T16:15:43.375202.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_24T16_15_43.375202", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T16:15:43.375202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T16:15:43.375202.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_24T16_15_43.375202", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T16:15:43.375202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T16:15:43.375202.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_24T16_15_43.375202", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T16:15:43.375202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T16:15:43.375202.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_24T16_15_43.375202", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T16:15:43.375202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T16:15:43.375202.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_24T16_15_43.375202", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T16:15:43.375202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T16:15:43.375202.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_24T16_15_43.375202", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T16:15:43.375202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T16:15:43.375202.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_24T16_15_43.375202", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T16:15:43.375202.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T16:15:43.375202.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_24T16_15_43.375202", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T16:15:43.375202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T16:15:43.375202.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_24T16_15_43.375202", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T16:15:43.375202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T16:15:43.375202.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_24T16_15_43.375202", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T16:15:43.375202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T16:15:43.375202.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_24T16_15_43.375202", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T16:15:43.375202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T16:15:43.375202.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_24T16_15_43.375202", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T16:15:43.375202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T16:15:43.375202.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_24T16_15_43.375202", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T16:15:43.375202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T16:15:43.375202.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_24T16_15_43.375202", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T16:15:43.375202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T16:15:43.375202.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_24T16_15_43.375202", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T16:15:43.375202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T16:15:43.375202.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_24T16_15_43.375202", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T16:15:43.375202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T16:15:43.375202.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_24T16_15_43.375202", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T16:15:43.375202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T16:15:43.375202.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_24T16_15_43.375202", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T16:15:43.375202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T16:15:43.375202.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_24T16_15_43.375202", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T16:15:43.375202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T16:15:43.375202.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_24T16_15_43.375202", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T16:15:43.375202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T16:15:43.375202.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_24T16_15_43.375202", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T16:15:43.375202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T16:15:43.375202.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_24T16_15_43.375202", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T16:15:43.375202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T16:15:43.375202.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_24T16_15_43.375202", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T16:15:43.375202.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T16:15:43.375202.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_24T16_15_43.375202", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T16:15:43.375202.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T16:15:43.375202.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_18T23_33_59.473281", "path": ["**/details_harness|winogrande|5_2023-10-18T23-33-59.473281.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-18T23-33-59.473281.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_24T16_15_43.375202", "path": ["results_2023-07-24T16:15:43.375202.parquet"]}, {"split": "2023_10_18T23_33_59.473281", "path": ["results_2023-10-18T23-33-59.473281.parquet"]}, {"split": "latest", "path": ["results_2023-10-18T23-33-59.473281.parquet"]}]}]}
|
2023-10-18T22:34:12+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of openchat/openchat_v2
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model openchat/openchat_v2 on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
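The snippet itself was stripped from this copy of the card; restoring it from the full card above:
```python
from datasets import load_dataset

data = load_dataset("open-llm-leaderboard/details_openchat__openchat_v2",
	"harness_winogrande_5",
	split="train")
```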
## Latest results
These are the latest results from run 2023-10-18T23:33:59.473281 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of openchat/openchat_v2",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model openchat/openchat_v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-18T23:33:59.473281(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of openchat/openchat_v2",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model openchat/openchat_v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-18T23:33:59.473281(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
17,
31,
165,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of openchat/openchat_v2## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model openchat/openchat_v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-18T23:33:59.473281(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
bf8441ca09b7812c4304134be34dc2296163fe56
|
# Dataset Card for Evaluation run of openchat/openchat_v3.2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/openchat/openchat_v3.2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [openchat/openchat_v3.2](https://huggingface.co/openchat/openchat_v3.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_openchat__openchat_v3.2",
"harness_winogrande_5",
split="train")
```
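Since this dataset was built from 3 runs, each configuration also exposes one split per run timestamp; a sketch loading a specific run rather than the latest one (the timestamped split name comes from this card's metadata):
```python
from datasets import load_dataset

# Load the DROP details for the 2023-10-19 run instead of the "latest" split
data = load_dataset("open-llm-leaderboard/details_openchat__openchat_v3.2",
	"harness_drop_3",
	split="2023_10_19T16_18_30.810728")
```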
## Latest results
These are the [latest results from run 2023-10-19T16:18:30.810728](https://huggingface.co/datasets/open-llm-leaderboard/details_openchat__openchat_v3.2/blob/main/results_2023-10-19T16-18-30.810728.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.001363255033557047,
"em_stderr": 0.00037786091964610503,
"f1": 0.06215813758389262,
"f1_stderr": 0.001356812104243941,
"acc": 0.4530006767701489,
"acc_stderr": 0.010645807081826102
},
"harness|drop|3": {
"em": 0.001363255033557047,
"em_stderr": 0.00037786091964610503,
"f1": 0.06215813758389262,
"f1_stderr": 0.001356812104243941
},
"harness|gsm8k|5": {
"acc": 0.13646702047005307,
"acc_stderr": 0.00945574199881554
},
"harness|winogrande|5": {
"acc": 0.7695343330702447,
"acc_stderr": 0.011835872164836664
}
}
```
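
Rather than copying these numbers by hand, the same aggregated metrics can be read back from the "results" configuration; a minimal sketch (the exact row layout is an assumption and may differ from the JSON above):

```python
from datasets import load_dataset

# "latest" points at the most recent run, here 2023-10-19T16:18:30.810728.
results = load_dataset("open-llm-leaderboard/details_openchat__openchat_v3.2",
                       "results",
                       split="latest")
print(results[0])  # field names are assumed to mirror the JSON shown above
```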
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_openchat__openchat_v3.2
|
[
"region:us"
] |
2023-08-18T10:15:22+00:00
|
{"pretty_name": "Evaluation run of openchat/openchat_v3.2", "dataset_summary": "Dataset automatically created during the evaluation run of model [openchat/openchat_v3.2](https://huggingface.co/openchat/openchat_v3.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_openchat__openchat_v3.2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-19T16:18:30.810728](https://huggingface.co/datasets/open-llm-leaderboard/details_openchat__openchat_v3.2/blob/main/results_2023-10-19T16-18-30.810728.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.001363255033557047,\n \"em_stderr\": 0.00037786091964610503,\n \"f1\": 0.06215813758389262,\n \"f1_stderr\": 0.001356812104243941,\n \"acc\": 0.4530006767701489,\n \"acc_stderr\": 0.010645807081826102\n },\n \"harness|drop|3\": {\n \"em\": 0.001363255033557047,\n \"em_stderr\": 0.00037786091964610503,\n \"f1\": 0.06215813758389262,\n \"f1_stderr\": 0.001356812104243941\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.13646702047005307,\n \"acc_stderr\": 0.00945574199881554\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7695343330702447,\n \"acc_stderr\": 0.011835872164836664\n }\n}\n```", "repo_url": "https://huggingface.co/openchat/openchat_v3.2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_02T17_42_42.050000", "path": ["**/details_harness|arc:challenge|25_2023-08-02T17:42:42.050000.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-02T17:42:42.050000.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_17T09_17_54.525414", "path": ["**/details_harness|drop|3_2023-10-17T09-17-54.525414.parquet"]}, {"split": "2023_10_19T16_18_30.810728", "path": ["**/details_harness|drop|3_2023-10-19T16-18-30.810728.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-19T16-18-30.810728.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_17T09_17_54.525414", "path": ["**/details_harness|gsm8k|5_2023-10-17T09-17-54.525414.parquet"]}, {"split": "2023_10_19T16_18_30.810728", "path": ["**/details_harness|gsm8k|5_2023-10-19T16-18-30.810728.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-19T16-18-30.810728.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_02T17_42_42.050000", "path": 
["**/details_harness|hellaswag|10_2023-08-02T17:42:42.050000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-02T17:42:42.050000.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_02T17_42_42.050000", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-02T17:42:42.050000.parquet", 
"**/details_harness|hendrycksTest-jurisprudence|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-02T17:42:42.050000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-02T17:42:42.050000.parquet", 
"**/details_harness|hendrycksTest-global_facts|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-02T17:42:42.050000.parquet", 
"**/details_harness|hendrycksTest-virology|5_2023-08-02T17:42:42.050000.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-02T17:42:42.050000.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_02T17_42_42.050000", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-02T17:42:42.050000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-02T17:42:42.050000.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_02T17_42_42.050000", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-02T17:42:42.050000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-02T17:42:42.050000.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_02T17_42_42.050000", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-02T17:42:42.050000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-02T17:42:42.050000.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_02T17_42_42.050000", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-02T17:42:42.050000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-02T17:42:42.050000.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_02T17_42_42.050000", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-02T17:42:42.050000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-02T17:42:42.050000.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_02T17_42_42.050000", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-02T17:42:42.050000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-02T17:42:42.050000.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_02T17_42_42.050000", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-02T17:42:42.050000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-02T17:42:42.050000.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_02T17_42_42.050000", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-02T17:42:42.050000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-02T17:42:42.050000.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_02T17_42_42.050000", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-02T17:42:42.050000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-02T17:42:42.050000.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_02T17_42_42.050000", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-02T17:42:42.050000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-02T17:42:42.050000.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_02T17_42_42.050000", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-02T17:42:42.050000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-02T17:42:42.050000.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_02T17_42_42.050000", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-02T17:42:42.050000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-02T17:42:42.050000.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_02T17_42_42.050000", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-02T17:42:42.050000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-02T17:42:42.050000.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_02T17_42_42.050000", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-02T17:42:42.050000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-02T17:42:42.050000.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_02T17_42_42.050000", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-02T17:42:42.050000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-02T17:42:42.050000.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_02T17_42_42.050000", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-02T17:42:42.050000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-02T17:42:42.050000.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_02T17_42_42.050000", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-02T17:42:42.050000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-02T17:42:42.050000.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_02T17_42_42.050000", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-02T17:42:42.050000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-02T17:42:42.050000.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_02T17_42_42.050000", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-02T17:42:42.050000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-02T17:42:42.050000.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_02T17_42_42.050000", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-02T17:42:42.050000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-02T17:42:42.050000.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_02T17_42_42.050000", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-02T17:42:42.050000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-02T17:42:42.050000.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_02T17_42_42.050000", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-02T17:42:42.050000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-02T17:42:42.050000.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_02T17_42_42.050000", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-02T17:42:42.050000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-02T17:42:42.050000.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_02T17_42_42.050000", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-02T17:42:42.050000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-02T17:42:42.050000.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_02T17_42_42.050000", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-02T17:42:42.050000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-02T17:42:42.050000.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_02T17_42_42.050000", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-02T17:42:42.050000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-02T17:42:42.050000.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_02T17_42_42.050000", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-02T17:42:42.050000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-02T17:42:42.050000.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_02T17_42_42.050000", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-02T17:42:42.050000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-02T17:42:42.050000.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_02T17_42_42.050000", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-02T17:42:42.050000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-02T17:42:42.050000.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_02T17_42_42.050000", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-02T17:42:42.050000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-02T17:42:42.050000.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_02T17_42_42.050000", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-02T17:42:42.050000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-02T17:42:42.050000.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_02T17_42_42.050000", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-02T17:42:42.050000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-02T17:42:42.050000.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_02T17_42_42.050000", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-02T17:42:42.050000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-02T17:42:42.050000.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_02T17_42_42.050000", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-02T17:42:42.050000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-02T17:42:42.050000.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_02T17_42_42.050000", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-02T17:42:42.050000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-02T17:42:42.050000.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_02T17_42_42.050000", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-02T17:42:42.050000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-02T17:42:42.050000.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_02T17_42_42.050000", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-02T17:42:42.050000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-02T17:42:42.050000.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_02T17_42_42.050000", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-02T17:42:42.050000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-02T17:42:42.050000.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_02T17_42_42.050000", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-02T17:42:42.050000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-02T17:42:42.050000.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_02T17_42_42.050000", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-02T17:42:42.050000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-02T17:42:42.050000.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_02T17_42_42.050000", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-02T17:42:42.050000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-02T17:42:42.050000.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_02T17_42_42.050000", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-02T17:42:42.050000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-02T17:42:42.050000.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_02T17_42_42.050000", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-02T17:42:42.050000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-02T17:42:42.050000.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_02T17_42_42.050000", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-02T17:42:42.050000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-02T17:42:42.050000.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_02T17_42_42.050000", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-02T17:42:42.050000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-02T17:42:42.050000.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_02T17_42_42.050000", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-02T17:42:42.050000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-02T17:42:42.050000.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_02T17_42_42.050000", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-02T17:42:42.050000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-02T17:42:42.050000.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_02T17_42_42.050000", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-02T17:42:42.050000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-02T17:42:42.050000.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_02T17_42_42.050000", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-02T17:42:42.050000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-02T17:42:42.050000.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_02T17_42_42.050000", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-02T17:42:42.050000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-02T17:42:42.050000.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_02T17_42_42.050000", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-02T17:42:42.050000.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-02T17:42:42.050000.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_02T17_42_42.050000", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-02T17:42:42.050000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-02T17:42:42.050000.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_02T17_42_42.050000", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-02T17:42:42.050000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-02T17:42:42.050000.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_02T17_42_42.050000", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-02T17:42:42.050000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-02T17:42:42.050000.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_02T17_42_42.050000", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-02T17:42:42.050000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-02T17:42:42.050000.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_02T17_42_42.050000", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-02T17:42:42.050000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-02T17:42:42.050000.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_02T17_42_42.050000", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-02T17:42:42.050000.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-02T17:42:42.050000.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_02T17_42_42.050000", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-02T17:42:42.050000.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-02T17:42:42.050000.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_17T09_17_54.525414", "path": ["**/details_harness|winogrande|5_2023-10-17T09-17-54.525414.parquet"]}, {"split": "2023_10_19T16_18_30.810728", "path": ["**/details_harness|winogrande|5_2023-10-19T16-18-30.810728.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-19T16-18-30.810728.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_02T17_42_42.050000", "path": ["results_2023-08-02T17:42:42.050000.parquet"]}, {"split": "2023_10_17T09_17_54.525414", "path": ["results_2023-10-17T09-17-54.525414.parquet"]}, {"split": "2023_10_19T16_18_30.810728", "path": ["results_2023-10-19T16-18-30.810728.parquet"]}, {"split": "latest", "path": ["results_2023-10-19T16-18-30.810728.parquet"]}]}]}
|
2023-10-19T15:18:46+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of openchat/openchat_v3.2
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model openchat/openchat_v3.2 on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-10-19T16:18:30.810728 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of openchat/openchat_v3.2",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model openchat/openchat_v3.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-19T16:18:30.810728(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of openchat/openchat_v3.2",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model openchat/openchat_v3.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-19T16:18:30.810728(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
17,
31,
165,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of openchat/openchat_v3.2## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model openchat/openchat_v3.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-19T16:18:30.810728(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
d95b312e4062e8cf2c2b953b413825b5ff98d9f5
|
# Dataset Card for Evaluation run of openchat/openchat_v2_w
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/openchat/openchat_v2_w
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [openchat/openchat_v2_w](https://huggingface.co/openchat/openchat_v2_w) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_openchat__openchat_v2_w",
"harness_winogrande_5",
split="train")
```
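
Each run is also exposed under its own timestamped split, so a specific evaluation can be pinned instead of following the moving "latest" pointer; for example, using the 2023-10-25 GSM8K run listed in this dataset's configuration metadata:

```python
from datasets import load_dataset

# Pin the GSM8K details to the 2023-10-25 run rather than "latest".
data = load_dataset("open-llm-leaderboard/details_openchat__openchat_v2_w",
                    "harness_gsm8k_5",
                    split="2023_10_25T10_16_39.894095")
```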
## Latest results
These are the [latest results from run 2023-10-25T10:16:39.894095](https://huggingface.co/datasets/open-llm-leaderboard/details_openchat__openchat_v2_w/blob/main/results_2023-10-25T10-16-39.894095.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0017827181208053692,
"em_stderr": 0.0004320097346038692,
"f1": 0.06345113255033595,
"f1_stderr": 0.0013770461350277562,
"acc": 0.4217142689595871,
"acc_stderr": 0.009831291629413687
},
"harness|drop|3": {
"em": 0.0017827181208053692,
"em_stderr": 0.0004320097346038692,
"f1": 0.06345113255033595,
"f1_stderr": 0.0013770461350277562
},
"harness|gsm8k|5": {
"acc": 0.0841546626231994,
"acc_stderr": 0.007647024046603207
},
"harness|winogrande|5": {
"acc": 0.7592738752959748,
"acc_stderr": 0.012015559212224167
}
}
```
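
To track how these numbers move between runs, the "results" configuration can be loaded once per timestamped split and compared; a hedged sketch (the earlier split name is taken from this card's run timestamps, and its availability under "results", like the per-row field layout, is an assumption):

```python
from datasets import load_dataset

repo = "open-llm-leaderboard/details_openchat__openchat_v2_w"
latest = load_dataset(repo, "results", split="latest")
earlier = load_dataset(repo, "results", split="2023_10_19T04_55_59.182634")
# Compare the aggregated scores across the two runs (field layout is assumed
# to mirror the JSON above, e.g. per-task keys such as "harness|winogrande|5").
print(latest[0], earlier[0])
```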
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_openchat__openchat_v2_w
|
[
"region:us"
] |
2023-08-18T10:15:31+00:00
|
{"pretty_name": "Evaluation run of openchat/openchat_v2_w", "dataset_summary": "Dataset automatically created during the evaluation run of model [openchat/openchat_v2_w](https://huggingface.co/openchat/openchat_v2_w) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_openchat__openchat_v2_w\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-25T10:16:39.894095](https://huggingface.co/datasets/open-llm-leaderboard/details_openchat__openchat_v2_w/blob/main/results_2023-10-25T10-16-39.894095.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0017827181208053692,\n \"em_stderr\": 0.0004320097346038692,\n \"f1\": 0.06345113255033595,\n \"f1_stderr\": 0.0013770461350277562,\n \"acc\": 0.4217142689595871,\n \"acc_stderr\": 0.009831291629413687\n },\n \"harness|drop|3\": {\n \"em\": 0.0017827181208053692,\n \"em_stderr\": 0.0004320097346038692,\n \"f1\": 0.06345113255033595,\n \"f1_stderr\": 0.0013770461350277562\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0841546626231994,\n \"acc_stderr\": 0.007647024046603207\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7592738752959748,\n \"acc_stderr\": 0.012015559212224167\n }\n}\n```", "repo_url": "https://huggingface.co/openchat/openchat_v2_w", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_24T16_07_10.180940", "path": ["**/details_harness|arc:challenge|25_2023-07-24T16:07:10.180940.parquet"]}, {"split": "2023_08_09T10_10_49.498602", "path": ["**/details_harness|arc:challenge|25_2023-08-09T10:10:49.498602.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-09T10:10:49.498602.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_19T04_55_59.182634", "path": ["**/details_harness|drop|3_2023-10-19T04-55-59.182634.parquet"]}, {"split": "2023_10_25T10_16_39.894095", "path": ["**/details_harness|drop|3_2023-10-25T10-16-39.894095.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-25T10-16-39.894095.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_19T04_55_59.182634", "path": ["**/details_harness|gsm8k|5_2023-10-19T04-55-59.182634.parquet"]}, {"split": "2023_10_25T10_16_39.894095", "path": ["**/details_harness|gsm8k|5_2023-10-25T10-16-39.894095.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-25T10-16-39.894095.parquet"]}]}, 
{"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_24T16_07_10.180940", "path": ["**/details_harness|hellaswag|10_2023-07-24T16:07:10.180940.parquet"]}, {"split": "2023_08_09T10_10_49.498602", "path": ["**/details_harness|hellaswag|10_2023-08-09T10:10:49.498602.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-09T10:10:49.498602.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_24T16_07_10.180940", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T16:07:10.180940.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T16:07:10.180940.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T16:07:10.180940.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T16:07:10.180940.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T16:07:10.180940.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T16:07:10.180940.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T16:07:10.180940.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T16:07:10.180940.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T16:07:10.180940.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T16:07:10.180940.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T16:07:10.180940.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T16:07:10.180940.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T16:07:10.180940.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-24T16:07:10.180940.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T16:07:10.180940.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T16:07:10.180940.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T16:07:10.180940.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T16:07:10.180940.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T16:07:10.180940.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T16:07:10.180940.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T16:07:10.180940.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T16:07:10.180940.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T16:07:10.180940.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T16:07:10.180940.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T16:07:10.180940.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T16:07:10.180940.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T16:07:10.180940.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T16:07:10.180940.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T16:07:10.180940.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T16:07:10.180940.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T16:07:10.180940.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T16:07:10.180940.parquet", 
"**/details_harness|hendrycksTest-human_aging|5_2023-07-24T16:07:10.180940.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T16:07:10.180940.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-24T16:07:10.180940.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T16:07:10.180940.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T16:07:10.180940.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T16:07:10.180940.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-24T16:07:10.180940.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T16:07:10.180940.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T16:07:10.180940.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T16:07:10.180940.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T16:07:10.180940.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T16:07:10.180940.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T16:07:10.180940.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T16:07:10.180940.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T16:07:10.180940.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T16:07:10.180940.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T16:07:10.180940.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T16:07:10.180940.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T16:07:10.180940.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-24T16:07:10.180940.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T16:07:10.180940.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T16:07:10.180940.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T16:07:10.180940.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T16:07:10.180940.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-24T16:07:10.180940.parquet"]}, {"split": "2023_08_09T10_10_49.498602", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T10:10:49.498602.parquet", 
"**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T10:10:49.498602.parquet", 
"**/details_harness|hendrycksTest-security_studies|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T10:10:49.498602.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T10:10:49.498602.parquet", 
"**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T10:10:49.498602.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T10:10:49.498602.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_24T16_07_10.180940", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T16:07:10.180940.parquet"]}, {"split": "2023_08_09T10_10_49.498602", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T10:10:49.498602.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T10:10:49.498602.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_24T16_07_10.180940", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T16:07:10.180940.parquet"]}, {"split": "2023_08_09T10_10_49.498602", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T10:10:49.498602.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T10:10:49.498602.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_24T16_07_10.180940", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T16:07:10.180940.parquet"]}, {"split": "2023_08_09T10_10_49.498602", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T10:10:49.498602.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T10:10:49.498602.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_24T16_07_10.180940", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T16:07:10.180940.parquet"]}, {"split": "2023_08_09T10_10_49.498602", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T10:10:49.498602.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T10:10:49.498602.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_24T16_07_10.180940", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T16:07:10.180940.parquet"]}, {"split": "2023_08_09T10_10_49.498602", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T10:10:49.498602.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T10:10:49.498602.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_24T16_07_10.180940", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T16:07:10.180940.parquet"]}, {"split": "2023_08_09T10_10_49.498602", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T10:10:49.498602.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T10:10:49.498602.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_24T16_07_10.180940", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T16:07:10.180940.parquet"]}, {"split": "2023_08_09T10_10_49.498602", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T10:10:49.498602.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T10:10:49.498602.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_24T16_07_10.180940", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T16:07:10.180940.parquet"]}, {"split": "2023_08_09T10_10_49.498602", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T10:10:49.498602.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T10:10:49.498602.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_24T16_07_10.180940", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T16:07:10.180940.parquet"]}, {"split": "2023_08_09T10_10_49.498602", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T10:10:49.498602.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T10:10:49.498602.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_24T16_07_10.180940", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T16:07:10.180940.parquet"]}, {"split": "2023_08_09T10_10_49.498602", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T10:10:49.498602.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T10:10:49.498602.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_24T16_07_10.180940", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T16:07:10.180940.parquet"]}, {"split": "2023_08_09T10_10_49.498602", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T10:10:49.498602.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T10:10:49.498602.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_24T16_07_10.180940", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T16:07:10.180940.parquet"]}, {"split": "2023_08_09T10_10_49.498602", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T10:10:49.498602.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T10:10:49.498602.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_24T16_07_10.180940", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T16:07:10.180940.parquet"]}, {"split": "2023_08_09T10_10_49.498602", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T10:10:49.498602.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T10:10:49.498602.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_24T16_07_10.180940", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T16:07:10.180940.parquet"]}, {"split": "2023_08_09T10_10_49.498602", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T10:10:49.498602.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T10:10:49.498602.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_24T16_07_10.180940", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T16:07:10.180940.parquet"]}, {"split": "2023_08_09T10_10_49.498602", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T10:10:49.498602.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T10:10:49.498602.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_24T16_07_10.180940", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T16:07:10.180940.parquet"]}, {"split": "2023_08_09T10_10_49.498602", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T10:10:49.498602.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T10:10:49.498602.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_24T16_07_10.180940", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T16:07:10.180940.parquet"]}, {"split": "2023_08_09T10_10_49.498602", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T10:10:49.498602.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T10:10:49.498602.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_24T16_07_10.180940", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T16:07:10.180940.parquet"]}, {"split": "2023_08_09T10_10_49.498602", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T10:10:49.498602.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T10:10:49.498602.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_24T16_07_10.180940", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T16:07:10.180940.parquet"]}, {"split": "2023_08_09T10_10_49.498602", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T10:10:49.498602.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T10:10:49.498602.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_24T16_07_10.180940", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T16:07:10.180940.parquet"]}, {"split": "2023_08_09T10_10_49.498602", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T10:10:49.498602.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T10:10:49.498602.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_24T16_07_10.180940", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T16:07:10.180940.parquet"]}, {"split": "2023_08_09T10_10_49.498602", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T10:10:49.498602.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T10:10:49.498602.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_24T16_07_10.180940", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T16:07:10.180940.parquet"]}, {"split": "2023_08_09T10_10_49.498602", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T10:10:49.498602.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T10:10:49.498602.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_24T16_07_10.180940", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T16:07:10.180940.parquet"]}, {"split": "2023_08_09T10_10_49.498602", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T10:10:49.498602.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T10:10:49.498602.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_24T16_07_10.180940", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T16:07:10.180940.parquet"]}, {"split": "2023_08_09T10_10_49.498602", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T10:10:49.498602.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T10:10:49.498602.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_24T16_07_10.180940", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T16:07:10.180940.parquet"]}, 
{"split": "2023_08_09T10_10_49.498602", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T10:10:49.498602.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T10:10:49.498602.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_24T16_07_10.180940", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T16:07:10.180940.parquet"]}, {"split": "2023_08_09T10_10_49.498602", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T10:10:49.498602.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T10:10:49.498602.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_24T16_07_10.180940", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T16:07:10.180940.parquet"]}, {"split": "2023_08_09T10_10_49.498602", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T10:10:49.498602.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T10:10:49.498602.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_24T16_07_10.180940", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T16:07:10.180940.parquet"]}, {"split": "2023_08_09T10_10_49.498602", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T10:10:49.498602.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T10:10:49.498602.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_24T16_07_10.180940", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T16:07:10.180940.parquet"]}, {"split": "2023_08_09T10_10_49.498602", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T10:10:49.498602.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T10:10:49.498602.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_24T16_07_10.180940", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T16:07:10.180940.parquet"]}, {"split": "2023_08_09T10_10_49.498602", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T10:10:49.498602.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T10:10:49.498602.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_24T16_07_10.180940", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T16:07:10.180940.parquet"]}, {"split": "2023_08_09T10_10_49.498602", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T10:10:49.498602.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T10:10:49.498602.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_24T16_07_10.180940", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T16:07:10.180940.parquet"]}, 
{"split": "2023_08_09T10_10_49.498602", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T10:10:49.498602.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T10:10:49.498602.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_24T16_07_10.180940", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T16:07:10.180940.parquet"]}, {"split": "2023_08_09T10_10_49.498602", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T10:10:49.498602.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T10:10:49.498602.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_24T16_07_10.180940", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T16:07:10.180940.parquet"]}, {"split": "2023_08_09T10_10_49.498602", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T10:10:49.498602.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T10:10:49.498602.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_24T16_07_10.180940", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T16:07:10.180940.parquet"]}, {"split": "2023_08_09T10_10_49.498602", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T10:10:49.498602.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T10:10:49.498602.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_24T16_07_10.180940", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T16:07:10.180940.parquet"]}, {"split": "2023_08_09T10_10_49.498602", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T10:10:49.498602.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T10:10:49.498602.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_24T16_07_10.180940", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T16:07:10.180940.parquet"]}, {"split": "2023_08_09T10_10_49.498602", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T10:10:49.498602.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T10:10:49.498602.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_24T16_07_10.180940", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T16:07:10.180940.parquet"]}, {"split": "2023_08_09T10_10_49.498602", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T10:10:49.498602.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T10:10:49.498602.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_24T16_07_10.180940", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T16:07:10.180940.parquet"]}, {"split": "2023_08_09T10_10_49.498602", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T10:10:49.498602.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-management|5_2023-08-09T10:10:49.498602.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_24T16_07_10.180940", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T16:07:10.180940.parquet"]}, {"split": "2023_08_09T10_10_49.498602", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T10:10:49.498602.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T10:10:49.498602.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_24T16_07_10.180940", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T16:07:10.180940.parquet"]}, {"split": "2023_08_09T10_10_49.498602", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T10:10:49.498602.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T10:10:49.498602.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_24T16_07_10.180940", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T16:07:10.180940.parquet"]}, {"split": "2023_08_09T10_10_49.498602", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T10:10:49.498602.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T10:10:49.498602.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_24T16_07_10.180940", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T16:07:10.180940.parquet"]}, {"split": "2023_08_09T10_10_49.498602", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T10:10:49.498602.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T10:10:49.498602.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_24T16_07_10.180940", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T16:07:10.180940.parquet"]}, {"split": "2023_08_09T10_10_49.498602", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T10:10:49.498602.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T10:10:49.498602.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_24T16_07_10.180940", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T16:07:10.180940.parquet"]}, {"split": "2023_08_09T10_10_49.498602", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T10:10:49.498602.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T10:10:49.498602.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_24T16_07_10.180940", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T16:07:10.180940.parquet"]}, {"split": "2023_08_09T10_10_49.498602", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T10:10:49.498602.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T10:10:49.498602.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_24T16_07_10.180940", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T16:07:10.180940.parquet"]}, 
{"split": "2023_08_09T10_10_49.498602", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T10:10:49.498602.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T10:10:49.498602.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_24T16_07_10.180940", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T16:07:10.180940.parquet"]}, {"split": "2023_08_09T10_10_49.498602", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T10:10:49.498602.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T10:10:49.498602.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_24T16_07_10.180940", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T16:07:10.180940.parquet"]}, {"split": "2023_08_09T10_10_49.498602", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T10:10:49.498602.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T10:10:49.498602.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_24T16_07_10.180940", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T16:07:10.180940.parquet"]}, {"split": "2023_08_09T10_10_49.498602", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T10:10:49.498602.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T10:10:49.498602.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_24T16_07_10.180940", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T16:07:10.180940.parquet"]}, {"split": "2023_08_09T10_10_49.498602", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T10:10:49.498602.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T10:10:49.498602.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_24T16_07_10.180940", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T16:07:10.180940.parquet"]}, {"split": "2023_08_09T10_10_49.498602", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T10:10:49.498602.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T10:10:49.498602.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_24T16_07_10.180940", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T16:07:10.180940.parquet"]}, {"split": "2023_08_09T10_10_49.498602", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T10:10:49.498602.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T10:10:49.498602.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_24T16_07_10.180940", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T16:07:10.180940.parquet"]}, {"split": "2023_08_09T10_10_49.498602", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T10:10:49.498602.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T10:10:49.498602.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_24T16_07_10.180940", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T16:07:10.180940.parquet"]}, {"split": "2023_08_09T10_10_49.498602", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T10:10:49.498602.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T10:10:49.498602.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_24T16_07_10.180940", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T16:07:10.180940.parquet"]}, {"split": "2023_08_09T10_10_49.498602", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T10:10:49.498602.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T10:10:49.498602.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_24T16_07_10.180940", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T16:07:10.180940.parquet"]}, {"split": "2023_08_09T10_10_49.498602", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T10:10:49.498602.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T10:10:49.498602.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_24T16_07_10.180940", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T16:07:10.180940.parquet"]}, {"split": "2023_08_09T10_10_49.498602", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T10:10:49.498602.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T10:10:49.498602.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_19T04_55_59.182634", "path": ["**/details_harness|winogrande|5_2023-10-19T04-55-59.182634.parquet"]}, {"split": "2023_10_25T10_16_39.894095", "path": ["**/details_harness|winogrande|5_2023-10-25T10-16-39.894095.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-25T10-16-39.894095.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_24T16_07_10.180940", "path": ["results_2023-07-24T16:07:10.180940.parquet"]}, {"split": "2023_08_09T10_10_49.498602", "path": ["results_2023-08-09T10:10:49.498602.parquet"]}, {"split": "2023_10_19T04_55_59.182634", "path": ["results_2023-10-19T04-55-59.182634.parquet"]}, {"split": "2023_10_25T10_16_39.894095", "path": ["results_2023-10-25T10-16-39.894095.parquet"]}, {"split": "latest", "path": ["results_2023-10-25T10-16-39.894095.parquet"]}]}]}
|
2023-10-25T09:16:49+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of openchat/openchat_v2_w
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model openchat/openchat_v2_w on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
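(The loading example was stripped from this copy of the card along with its URLs; the sketch below mirrors the pattern used by the other cards in this collection. The repository id is inferred from their naming scheme and is an assumption.)
```python
from datasets import load_dataset

# Repo id inferred from this collection's naming convention (assumption).
# "train" points at the latest results for the chosen configuration.
data = load_dataset("open-llm-leaderboard/details_openchat__openchat_v2_w",
	"harness_winogrande_5",
	split="train")
```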
## Latest results
These are the latest results from run 2023-10-25T10:16:39.894095 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of openchat/openchat_v2_w",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model openchat/openchat_v2_w on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-25T10:16:39.894095(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of openchat/openchat_v2_w",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model openchat/openchat_v2_w on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-25T10:16:39.894095(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
19,
31,
167,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of openchat/openchat_v2_w## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model openchat/openchat_v2_w on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-25T10:16:39.894095(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
524a3b8a7ba254dbb295a990561468a763f536db
|
# Dataset Card for Evaluation run of openchat/openchat_v3.1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/openchat/openchat_v3.1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [openchat/openchat_v3.1](https://huggingface.co/openchat/openchat_v3.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_openchat__openchat_v3.1",
"harness_winogrande_5",
split="train")
```
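Before loading a specific task, it can help to enumerate the available configurations; a small usage sketch using the `datasets` helper (not part of the original card):
```python
from datasets import get_dataset_config_names

# Lists the 64 per-task configurations plus the aggregated "results" config.
configs = get_dataset_config_names("open-llm-leaderboard/details_openchat__openchat_v3.1")
print(len(configs))
print(configs[:5])
```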
## Latest results
These are the [latest results from run 2023-10-16T02:39:54.553691](https://huggingface.co/datasets/open-llm-leaderboard/details_openchat__openchat_v3.1/blob/main/results_2023-10-16T02-39-54.553691.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0016778523489932886,
"em_stderr": 0.00041913301788269345,
"f1": 0.06259228187919454,
"f1_stderr": 0.001365935795409535,
"acc": 0.45020712996200873,
"acc_stderr": 0.010730538116775
},
"harness|drop|3": {
"em": 0.0016778523489932886,
"em_stderr": 0.00041913301788269345,
"f1": 0.06259228187919454,
"f1_stderr": 0.001365935795409535
},
"harness|gsm8k|5": {
"acc": 0.1379833206974981,
"acc_stderr": 0.009499777327746841
},
"harness|winogrande|5": {
"acc": 0.7624309392265194,
"acc_stderr": 0.011961298905803162
}
}
```
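Rather than copying these numbers from the card, the same aggregated metrics can be read programmatically; a minimal sketch, assuming the "results" configuration and its "latest" split behave like the per-task configurations described above:
```python
from datasets import load_dataset

# The "results" config stores the aggregated metrics; "latest" is the newest run.
results = load_dataset("open-llm-leaderboard/details_openchat__openchat_v3.1",
	"results",
	split="latest")
print(results[0])  # one row holding the aggregated metrics of the latest run
```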
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_openchat__openchat_v3.1
|
[
"region:us"
] |
2023-08-18T10:15:45+00:00
|
{"pretty_name": "Evaluation run of openchat/openchat_v3.1", "dataset_summary": "Dataset automatically created during the evaluation run of model [openchat/openchat_v3.1](https://huggingface.co/openchat/openchat_v3.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_openchat__openchat_v3.1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-16T02:39:54.553691](https://huggingface.co/datasets/open-llm-leaderboard/details_openchat__openchat_v3.1/blob/main/results_2023-10-16T02-39-54.553691.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0016778523489932886,\n \"em_stderr\": 0.00041913301788269345,\n \"f1\": 0.06259228187919454,\n \"f1_stderr\": 0.001365935795409535,\n \"acc\": 0.45020712996200873,\n \"acc_stderr\": 0.010730538116775\n },\n \"harness|drop|3\": {\n \"em\": 0.0016778523489932886,\n \"em_stderr\": 0.00041913301788269345,\n \"f1\": 0.06259228187919454,\n \"f1_stderr\": 0.001365935795409535\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1379833206974981,\n \"acc_stderr\": 0.009499777327746841\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7624309392265194,\n \"acc_stderr\": 0.011961298905803162\n }\n}\n```", "repo_url": "https://huggingface.co/openchat/openchat_v3.1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_02T17_45_13.943818", "path": ["**/details_harness|arc:challenge|25_2023-08-02T17:45:13.943818.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-02T17:45:13.943818.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_24T04_16_26.631092", "path": ["**/details_harness|drop|3_2023-09-24T04-16-26.631092.parquet"]}, {"split": "2023_10_16T02_39_54.553691", "path": ["**/details_harness|drop|3_2023-10-16T02-39-54.553691.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-16T02-39-54.553691.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_24T04_16_26.631092", "path": ["**/details_harness|gsm8k|5_2023-09-24T04-16-26.631092.parquet"]}, {"split": "2023_10_16T02_39_54.553691", "path": ["**/details_harness|gsm8k|5_2023-10-16T02-39-54.553691.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-16T02-39-54.553691.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_02T17_45_13.943818", "path": 
["**/details_harness|hellaswag|10_2023-08-02T17:45:13.943818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-02T17:45:13.943818.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_02T17_45_13.943818", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-02T17:45:13.943818.parquet", 
"**/details_harness|hendrycksTest-jurisprudence|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-02T17:45:13.943818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-02T17:45:13.943818.parquet", 
"**/details_harness|hendrycksTest-global_facts|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-02T17:45:13.943818.parquet", 
"**/details_harness|hendrycksTest-virology|5_2023-08-02T17:45:13.943818.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-02T17:45:13.943818.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_02T17_45_13.943818", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-02T17:45:13.943818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-02T17:45:13.943818.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_02T17_45_13.943818", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-02T17:45:13.943818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-02T17:45:13.943818.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_02T17_45_13.943818", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-02T17:45:13.943818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-02T17:45:13.943818.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_02T17_45_13.943818", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-02T17:45:13.943818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-02T17:45:13.943818.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_02T17_45_13.943818", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-02T17:45:13.943818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-02T17:45:13.943818.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_02T17_45_13.943818", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-02T17:45:13.943818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-02T17:45:13.943818.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_02T17_45_13.943818", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-02T17:45:13.943818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-02T17:45:13.943818.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_02T17_45_13.943818", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-02T17:45:13.943818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-02T17:45:13.943818.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_02T17_45_13.943818", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-02T17:45:13.943818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-02T17:45:13.943818.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_02T17_45_13.943818", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-02T17:45:13.943818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-02T17:45:13.943818.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_02T17_45_13.943818", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-02T17:45:13.943818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-02T17:45:13.943818.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_02T17_45_13.943818", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-02T17:45:13.943818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-02T17:45:13.943818.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_02T17_45_13.943818", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-02T17:45:13.943818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-02T17:45:13.943818.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_02T17_45_13.943818", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-02T17:45:13.943818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-02T17:45:13.943818.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_02T17_45_13.943818", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-02T17:45:13.943818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-02T17:45:13.943818.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_02T17_45_13.943818", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-02T17:45:13.943818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-02T17:45:13.943818.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_02T17_45_13.943818", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-02T17:45:13.943818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-02T17:45:13.943818.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_02T17_45_13.943818", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-02T17:45:13.943818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-02T17:45:13.943818.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_02T17_45_13.943818", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-02T17:45:13.943818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-02T17:45:13.943818.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_02T17_45_13.943818", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-02T17:45:13.943818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-02T17:45:13.943818.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_02T17_45_13.943818", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-02T17:45:13.943818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-02T17:45:13.943818.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_02T17_45_13.943818", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-02T17:45:13.943818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-02T17:45:13.943818.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_02T17_45_13.943818", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-02T17:45:13.943818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-02T17:45:13.943818.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_02T17_45_13.943818", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-02T17:45:13.943818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-02T17:45:13.943818.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_02T17_45_13.943818", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-02T17:45:13.943818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-02T17:45:13.943818.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_02T17_45_13.943818", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-02T17:45:13.943818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-02T17:45:13.943818.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_02T17_45_13.943818", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-02T17:45:13.943818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-02T17:45:13.943818.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_02T17_45_13.943818", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-02T17:45:13.943818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-02T17:45:13.943818.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_02T17_45_13.943818", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-02T17:45:13.943818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-02T17:45:13.943818.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_02T17_45_13.943818", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-02T17:45:13.943818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-02T17:45:13.943818.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_02T17_45_13.943818", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-02T17:45:13.943818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-02T17:45:13.943818.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_02T17_45_13.943818", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-02T17:45:13.943818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-02T17:45:13.943818.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_02T17_45_13.943818", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-02T17:45:13.943818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-02T17:45:13.943818.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_02T17_45_13.943818", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-02T17:45:13.943818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-02T17:45:13.943818.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_02T17_45_13.943818", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-02T17:45:13.943818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-02T17:45:13.943818.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_02T17_45_13.943818", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-02T17:45:13.943818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-02T17:45:13.943818.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_02T17_45_13.943818", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-02T17:45:13.943818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-02T17:45:13.943818.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_02T17_45_13.943818", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-02T17:45:13.943818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-02T17:45:13.943818.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_02T17_45_13.943818", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-02T17:45:13.943818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-02T17:45:13.943818.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_02T17_45_13.943818", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-02T17:45:13.943818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-02T17:45:13.943818.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_02T17_45_13.943818", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-02T17:45:13.943818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-02T17:45:13.943818.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_02T17_45_13.943818", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-02T17:45:13.943818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-02T17:45:13.943818.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_02T17_45_13.943818", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-02T17:45:13.943818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-02T17:45:13.943818.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_02T17_45_13.943818", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-02T17:45:13.943818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-02T17:45:13.943818.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_02T17_45_13.943818", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-02T17:45:13.943818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-02T17:45:13.943818.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_02T17_45_13.943818", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-02T17:45:13.943818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-02T17:45:13.943818.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_02T17_45_13.943818", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-02T17:45:13.943818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-02T17:45:13.943818.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_02T17_45_13.943818", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-02T17:45:13.943818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-02T17:45:13.943818.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_02T17_45_13.943818", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-02T17:45:13.943818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-02T17:45:13.943818.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_02T17_45_13.943818", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-02T17:45:13.943818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-02T17:45:13.943818.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_02T17_45_13.943818", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-02T17:45:13.943818.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-02T17:45:13.943818.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_02T17_45_13.943818", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-02T17:45:13.943818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-02T17:45:13.943818.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_02T17_45_13.943818", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-02T17:45:13.943818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-02T17:45:13.943818.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_02T17_45_13.943818", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-02T17:45:13.943818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-02T17:45:13.943818.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_02T17_45_13.943818", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-02T17:45:13.943818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-02T17:45:13.943818.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_02T17_45_13.943818", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-02T17:45:13.943818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-02T17:45:13.943818.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_02T17_45_13.943818", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-02T17:45:13.943818.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-02T17:45:13.943818.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_02T17_45_13.943818", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-02T17:45:13.943818.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-02T17:45:13.943818.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_24T04_16_26.631092", "path": ["**/details_harness|winogrande|5_2023-09-24T04-16-26.631092.parquet"]}, {"split": "2023_10_16T02_39_54.553691", "path": ["**/details_harness|winogrande|5_2023-10-16T02-39-54.553691.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-16T02-39-54.553691.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_02T17_45_13.943818", "path": ["results_2023-08-02T17:45:13.943818.parquet"]}, {"split": "2023_09_24T04_16_26.631092", "path": ["results_2023-09-24T04-16-26.631092.parquet"]}, {"split": "2023_10_16T02_39_54.553691", "path": ["results_2023-10-16T02-39-54.553691.parquet"]}, {"split": "latest", "path": ["results_2023-10-16T02-39-54.553691.parquet"]}]}]}
|
2023-10-16T01:40:06+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of openchat/openchat_v3.1
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model openchat/openchat_v3.1 on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
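A minimal sketch, mirroring the loader snippet shown in the neighbouring cards in this dump; the repo id is an assumption based on the `details_<org>__<model>` naming these detail datasets use:

```python
from datasets import load_dataset

# Repo id assumed from the details_<org>__<model> naming convention
# used by the other Open LLM Leaderboard detail datasets.
data = load_dataset("open-llm-leaderboard/details_openchat__openchat_v3.1",
	"harness_winogrande_5",
	split="train")
```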
## Latest results
These are the latest results from run 2023-10-16T02:39:54.553691 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of openchat/openchat_v3.1",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model openchat/openchat_v3.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-16T02:39:54.553691(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of openchat/openchat_v3.1",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model openchat/openchat_v3.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-16T02:39:54.553691(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
17,
31,
165,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of openchat/openchat_v3.1## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model openchat/openchat_v3.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-16T02:39:54.553691(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
d796c5400ab254ef76a138d130b5dc9bc68f9282
|
# Dataset Card for Evaluation run of openchat/openchat_8192
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/openchat/openchat_8192
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [openchat/openchat_8192](https://huggingface.co/openchat/openchat_8192) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_openchat__openchat_8192",
"harness_winogrande_5",
split="train")
```
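Beyond a single task config, the summary above also documents a "results" config whose "latest" split carries the aggregated metrics, with each run appearing as its own timestamp-named split. A sketch of both, under those documented conventions:

```python
from datasets import load_dataset, get_dataset_split_names

# Each run is a timestamp-named split; "latest" points at the newest one.
print(get_dataset_split_names(
    "open-llm-leaderboard/details_openchat__openchat_8192", "results"))

# Aggregated metrics from the most recent run.
results = load_dataset("open-llm-leaderboard/details_openchat__openchat_8192",
	"results",
	split="latest")
```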
## Latest results
These are the [latest results from run 2023-10-18T17:33:27.764236](https://huggingface.co/datasets/open-llm-leaderboard/details_openchat__openchat_8192/blob/main/results_2023-10-18T17-33-27.764236.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.002936241610738255,
"em_stderr": 0.0005541113054710256,
"f1": 0.061647441275168025,
"f1_stderr": 0.0013798525607833089,
"acc": 0.4116716222677126,
"acc_stderr": 0.009681422698407207
},
"harness|drop|3": {
"em": 0.002936241610738255,
"em_stderr": 0.0005541113054710256,
"f1": 0.061647441275168025,
"f1_stderr": 0.0013798525607833089
},
"harness|gsm8k|5": {
"acc": 0.07354056103108415,
"acc_stderr": 0.007189835754365264
},
"harness|winogrande|5": {
"acc": 0.749802683504341,
"acc_stderr": 0.012173009642449151
}
}
```
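For quick scripted access, the same numbers can be read straight from the JSON file linked above; a sketch that derives the raw-file URL by swapping the link's `/blob/` segment for `/resolve/`, and that treats the exact key layout of the file as an assumption:

```python
import json
import urllib.request

# Raw-file URL derived from the /blob/ link above (the Hub serves file
# contents directly under /resolve/).
url = ("https://huggingface.co/datasets/open-llm-leaderboard/"
       "details_openchat__openchat_8192/resolve/main/"
       "results_2023-10-18T17-33-27.764236.json")
with urllib.request.urlopen(url) as f:
    payload = json.load(f)

# The snippet above shows blocks keyed like "harness|winogrande|5";
# whether they sit at the top level or under a "results" key is an
# assumption about the file layout.
metrics = payload.get("results", payload)
print(metrics["harness|winogrande|5"]["acc"])
```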
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_openchat__openchat_8192
|
[
"region:us"
] |
2023-08-18T10:15:54+00:00
|
{"pretty_name": "Evaluation run of openchat/openchat_8192", "dataset_summary": "Dataset automatically created during the evaluation run of model [openchat/openchat_8192](https://huggingface.co/openchat/openchat_8192) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_openchat__openchat_8192\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-18T17:33:27.764236](https://huggingface.co/datasets/open-llm-leaderboard/details_openchat__openchat_8192/blob/main/results_2023-10-18T17-33-27.764236.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.002936241610738255,\n \"em_stderr\": 0.0005541113054710256,\n \"f1\": 0.061647441275168025,\n \"f1_stderr\": 0.0013798525607833089,\n \"acc\": 0.4116716222677126,\n \"acc_stderr\": 0.009681422698407207\n },\n \"harness|drop|3\": {\n \"em\": 0.002936241610738255,\n \"em_stderr\": 0.0005541113054710256,\n \"f1\": 0.061647441275168025,\n \"f1_stderr\": 0.0013798525607833089\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.07354056103108415,\n \"acc_stderr\": 0.007189835754365264\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.749802683504341,\n \"acc_stderr\": 0.012173009642449151\n }\n}\n```", "repo_url": "https://huggingface.co/openchat/openchat_8192", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_24T16_57_40.978816", "path": ["**/details_harness|arc:challenge|25_2023-07-24T16:57:40.978816.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-24T16:57:40.978816.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_18T17_33_27.764236", "path": ["**/details_harness|drop|3_2023-10-18T17-33-27.764236.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-18T17-33-27.764236.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_18T17_33_27.764236", "path": ["**/details_harness|gsm8k|5_2023-10-18T17-33-27.764236.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-18T17-33-27.764236.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_24T16_57_40.978816", "path": ["**/details_harness|hellaswag|10_2023-07-24T16:57:40.978816.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-24T16:57:40.978816.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_07_24T16_57_40.978816", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T16:57:40.978816.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-24T16:57:40.978816.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T16:57:40.978816.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T16:57:40.978816.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-24T16:57:40.978816.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_24T16_57_40.978816", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T16:57:40.978816.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T16:57:40.978816.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_24T16_57_40.978816", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T16:57:40.978816.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T16:57:40.978816.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_24T16_57_40.978816", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T16:57:40.978816.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T16:57:40.978816.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_24T16_57_40.978816", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T16:57:40.978816.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T16:57:40.978816.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_24T16_57_40.978816", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T16:57:40.978816.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T16:57:40.978816.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_24T16_57_40.978816", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T16:57:40.978816.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T16:57:40.978816.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_24T16_57_40.978816", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T16:57:40.978816.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T16:57:40.978816.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_24T16_57_40.978816", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T16:57:40.978816.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T16:57:40.978816.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_24T16_57_40.978816", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T16:57:40.978816.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T16:57:40.978816.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_24T16_57_40.978816", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T16:57:40.978816.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T16:57:40.978816.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_24T16_57_40.978816", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T16:57:40.978816.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T16:57:40.978816.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_24T16_57_40.978816", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T16:57:40.978816.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T16:57:40.978816.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_24T16_57_40.978816", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T16:57:40.978816.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T16:57:40.978816.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_24T16_57_40.978816", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T16:57:40.978816.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T16:57:40.978816.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_24T16_57_40.978816", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T16:57:40.978816.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T16:57:40.978816.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_24T16_57_40.978816", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T16:57:40.978816.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T16:57:40.978816.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_24T16_57_40.978816", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T16:57:40.978816.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T16:57:40.978816.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_24T16_57_40.978816", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T16:57:40.978816.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T16:57:40.978816.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_24T16_57_40.978816", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T16:57:40.978816.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T16:57:40.978816.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_24T16_57_40.978816", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T16:57:40.978816.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T16:57:40.978816.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_24T16_57_40.978816", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T16:57:40.978816.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T16:57:40.978816.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_24T16_57_40.978816", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T16:57:40.978816.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T16:57:40.978816.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_24T16_57_40.978816", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T16:57:40.978816.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T16:57:40.978816.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_24T16_57_40.978816", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T16:57:40.978816.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T16:57:40.978816.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_24T16_57_40.978816", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T16:57:40.978816.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T16:57:40.978816.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_24T16_57_40.978816", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T16:57:40.978816.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T16:57:40.978816.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_24T16_57_40.978816", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T16:57:40.978816.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T16:57:40.978816.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_24T16_57_40.978816", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T16:57:40.978816.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T16:57:40.978816.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_24T16_57_40.978816", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T16:57:40.978816.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T16:57:40.978816.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_24T16_57_40.978816", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T16:57:40.978816.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T16:57:40.978816.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_24T16_57_40.978816", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T16:57:40.978816.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T16:57:40.978816.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_24T16_57_40.978816", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T16:57:40.978816.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T16:57:40.978816.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_24T16_57_40.978816", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T16:57:40.978816.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T16:57:40.978816.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_24T16_57_40.978816", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T16:57:40.978816.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T16:57:40.978816.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_24T16_57_40.978816", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T16:57:40.978816.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T16:57:40.978816.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_24T16_57_40.978816", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T16:57:40.978816.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T16:57:40.978816.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_24T16_57_40.978816", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T16:57:40.978816.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T16:57:40.978816.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_24T16_57_40.978816", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T16:57:40.978816.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T16:57:40.978816.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_24T16_57_40.978816", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T16:57:40.978816.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T16:57:40.978816.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_24T16_57_40.978816", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T16:57:40.978816.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T16:57:40.978816.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_24T16_57_40.978816", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T16:57:40.978816.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T16:57:40.978816.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_24T16_57_40.978816", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T16:57:40.978816.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T16:57:40.978816.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_24T16_57_40.978816", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T16:57:40.978816.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T16:57:40.978816.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_24T16_57_40.978816", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T16:57:40.978816.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T16:57:40.978816.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_24T16_57_40.978816", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T16:57:40.978816.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T16:57:40.978816.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_24T16_57_40.978816", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T16:57:40.978816.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T16:57:40.978816.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_24T16_57_40.978816", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T16:57:40.978816.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T16:57:40.978816.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_24T16_57_40.978816", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T16:57:40.978816.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T16:57:40.978816.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_24T16_57_40.978816", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T16:57:40.978816.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T16:57:40.978816.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_24T16_57_40.978816", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T16:57:40.978816.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T16:57:40.978816.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_24T16_57_40.978816", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T16:57:40.978816.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T16:57:40.978816.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_24T16_57_40.978816", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T16:57:40.978816.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T16:57:40.978816.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_24T16_57_40.978816", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T16:57:40.978816.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T16:57:40.978816.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_24T16_57_40.978816", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T16:57:40.978816.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T16:57:40.978816.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_24T16_57_40.978816", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T16:57:40.978816.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T16:57:40.978816.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_24T16_57_40.978816", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T16:57:40.978816.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T16:57:40.978816.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_24T16_57_40.978816", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T16:57:40.978816.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T16:57:40.978816.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_24T16_57_40.978816", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T16:57:40.978816.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T16:57:40.978816.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_18T17_33_27.764236", "path": ["**/details_harness|winogrande|5_2023-10-18T17-33-27.764236.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-18T17-33-27.764236.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_24T16_57_40.978816", "path": ["results_2023-07-24T16:57:40.978816.parquet"]}, {"split": "2023_10_18T17_33_27.764236", "path": ["results_2023-10-18T17-33-27.764236.parquet"]}, {"split": "latest", "path": ["results_2023-10-18T17-33-27.764236.parquet"]}]}]}
|
2023-10-18T16:33:41+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of openchat/openchat_8192
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model openchat/openchat_8192 on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
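A minimal sketch of that call, assuming the details repository follows the `open-llm-leaderboard/details_<org>__<model>` naming convention used by the other cards in this document (the exact repository id is not shown in this copy of the card):

```python
from datasets import load_dataset

# Repository id inferred from the naming convention used elsewhere in
# this document; verify it before relying on it.
data = load_dataset("open-llm-leaderboard/details_openchat__openchat_8192",
	"harness_winogrande_5",
	split="train")
```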
## Latest results
These are the latest results from run 2023-10-18T17:33:27.764236 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of openchat/openchat_8192",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model openchat/openchat_8192 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-18T17:33:27.764236(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of openchat/openchat_8192",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model openchat/openchat_8192 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-18T17:33:27.764236(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
17,
31,
165,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of openchat/openchat_8192## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model openchat/openchat_8192 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-18T17:33:27.764236(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
4a6527ddb6112c9361b3f2a90f7c113a329efd97
|
# Dataset Card for Evaluation run of SaylorTwift/gpt2_test
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/SaylorTwift/gpt2_test
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [SaylorTwift/gpt2_test](https://huggingface.co/SaylorTwift/gpt2_test) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_SaylorTwift__gpt2_test",
"harness_winogrande_5",
split="train")
```
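The aggregated metrics shown below are also stored in the dataset itself, under the "results" configuration listed in this card's metadata; a minimal sketch of loading its latest split:

```python
from datasets import load_dataset

# The "results" configuration holds the aggregated metrics; the
# "latest" split points at the most recent run (2023-09-22 here).
results = load_dataset("open-llm-leaderboard/details_SaylorTwift__gpt2_test",
	"results",
	split="latest")
```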
## Latest results
These are the [latest results from run 2023-09-22T16:48:41.866587](https://huggingface.co/datasets/open-llm-leaderboard/details_SaylorTwift__gpt2_test/blob/main/results_2023-09-22T16-48-41.866587.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0025167785234899327,
"em_stderr": 0.0005131152834514814,
"f1": 0.04780411073825513,
"f1_stderr": 0.0013732412097489425,
"acc": 0.25210824971442214,
"acc_stderr": 0.007783509925876779
},
"harness|drop|3": {
"em": 0.0025167785234899327,
"em_stderr": 0.0005131152834514814,
"f1": 0.04780411073825513,
"f1_stderr": 0.0013732412097489425
},
"harness|gsm8k|5": {
"acc": 0.003032600454890068,
"acc_stderr": 0.0015145735612245488
},
"harness|winogrande|5": {
"acc": 0.5011838989739542,
"acc_stderr": 0.014052446290529009
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_SaylorTwift__gpt2_test
|
[
"region:us"
] |
2023-08-18T10:16:03+00:00
|
{"pretty_name": "Evaluation run of SaylorTwift/gpt2_test", "dataset_summary": "Dataset automatically created during the evaluation run of model [SaylorTwift/gpt2_test](https://huggingface.co/SaylorTwift/gpt2_test) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_SaylorTwift__gpt2_test\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-22T16:48:41.866587](https://huggingface.co/datasets/open-llm-leaderboard/details_SaylorTwift__gpt2_test/blob/main/results_2023-09-22T16-48-41.866587.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0025167785234899327,\n \"em_stderr\": 0.0005131152834514814,\n \"f1\": 0.04780411073825513,\n \"f1_stderr\": 0.0013732412097489425,\n \"acc\": 0.25210824971442214,\n \"acc_stderr\": 0.007783509925876779\n },\n \"harness|drop|3\": {\n \"em\": 0.0025167785234899327,\n \"em_stderr\": 0.0005131152834514814,\n \"f1\": 0.04780411073825513,\n \"f1_stderr\": 0.0013732412097489425\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.003032600454890068,\n \"acc_stderr\": 0.0015145735612245488\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5011838989739542,\n \"acc_stderr\": 0.014052446290529009\n }\n}\n```", "repo_url": "https://huggingface.co/SaylorTwift/gpt2_test", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T19_08_58.298962", "path": ["**/details_harness|arc:challenge|25_2023-07-19T19:08:58.298962.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T19:08:58.298962.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_22T16_48_41.866587", "path": ["**/details_harness|drop|3_2023-09-22T16-48-41.866587.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-22T16-48-41.866587.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_22T16_48_41.866587", "path": ["**/details_harness|gsm8k|5_2023-09-22T16-48-41.866587.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-22T16-48-41.866587.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T19_08_58.298962", "path": ["**/details_harness|hellaswag|10_2023-07-19T19:08:58.298962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T19:08:58.298962.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_07_19T19_08_58.298962", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:08:58.298962.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:08:58.298962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:08:58.298962.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T19:08:58.298962.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:08:58.298962.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T19_08_58.298962", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:08:58.298962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:08:58.298962.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T19_08_58.298962", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:08:58.298962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:08:58.298962.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T19_08_58.298962", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:08:58.298962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:08:58.298962.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T19_08_58.298962", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:08:58.298962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:08:58.298962.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T19_08_58.298962", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:08:58.298962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:08:58.298962.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T19_08_58.298962", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:08:58.298962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:08:58.298962.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T19_08_58.298962", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:08:58.298962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:08:58.298962.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T19_08_58.298962", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:08:58.298962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:08:58.298962.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T19_08_58.298962", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:08:58.298962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:08:58.298962.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T19_08_58.298962", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:08:58.298962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:08:58.298962.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T19_08_58.298962", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:08:58.298962.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:08:58.298962.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T19_08_58.298962", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:08:58.298962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:08:58.298962.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T19_08_58.298962", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:08:58.298962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:08:58.298962.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T19_08_58.298962", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:08:58.298962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:08:58.298962.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T19_08_58.298962", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:08:58.298962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:08:58.298962.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T19_08_58.298962", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:08:58.298962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:08:58.298962.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T19_08_58.298962", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:08:58.298962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:08:58.298962.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T19_08_58.298962", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:08:58.298962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:08:58.298962.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T19_08_58.298962", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:08:58.298962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:08:58.298962.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T19_08_58.298962", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:08:58.298962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:08:58.298962.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T19_08_58.298962", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:08:58.298962.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:08:58.298962.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T19_08_58.298962", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:08:58.298962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:08:58.298962.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T19_08_58.298962", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:08:58.298962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:08:58.298962.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T19_08_58.298962", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:08:58.298962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:08:58.298962.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T19_08_58.298962", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:08:58.298962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:08:58.298962.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T19_08_58.298962", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:08:58.298962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:08:58.298962.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T19_08_58.298962", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:08:58.298962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:08:58.298962.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T19_08_58.298962", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:08:58.298962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:08:58.298962.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T19_08_58.298962", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:08:58.298962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:08:58.298962.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T19_08_58.298962", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:08:58.298962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:08:58.298962.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T19_08_58.298962", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:08:58.298962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:08:58.298962.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T19_08_58.298962", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:08:58.298962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:08:58.298962.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T19_08_58.298962", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:08:58.298962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:08:58.298962.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T19_08_58.298962", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:08:58.298962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:08:58.298962.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T19_08_58.298962", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:08:58.298962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:08:58.298962.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T19_08_58.298962", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:08:58.298962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:08:58.298962.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T19_08_58.298962", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:08:58.298962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:08:58.298962.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T19_08_58.298962", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:08:58.298962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:08:58.298962.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T19_08_58.298962", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T19:08:58.298962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T19:08:58.298962.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T19_08_58.298962", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:08:58.298962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:08:58.298962.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T19_08_58.298962", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:08:58.298962.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:08:58.298962.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T19_08_58.298962", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:08:58.298962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:08:58.298962.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T19_08_58.298962", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:08:58.298962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:08:58.298962.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T19_08_58.298962", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:08:58.298962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:08:58.298962.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T19_08_58.298962", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:08:58.298962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:08:58.298962.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T19_08_58.298962", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:08:58.298962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:08:58.298962.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T19_08_58.298962", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:08:58.298962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:08:58.298962.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T19_08_58.298962", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:08:58.298962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:08:58.298962.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T19_08_58.298962", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:08:58.298962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:08:58.298962.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T19_08_58.298962", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:08:58.298962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:08:58.298962.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T19_08_58.298962", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:08:58.298962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:08:58.298962.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T19_08_58.298962", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:08:58.298962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:08:58.298962.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T19_08_58.298962", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:08:58.298962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:08:58.298962.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T19_08_58.298962", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:08:58.298962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:08:58.298962.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T19_08_58.298962", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:08:58.298962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:08:58.298962.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T19_08_58.298962", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T19:08:58.298962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T19:08:58.298962.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T19_08_58.298962", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:08:58.298962.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:08:58.298962.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T19_08_58.298962", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T19:08:58.298962.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T19:08:58.298962.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_22T16_48_41.866587", "path": ["**/details_harness|winogrande|5_2023-09-22T16-48-41.866587.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-22T16-48-41.866587.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T19_08_58.298962", "path": ["results_2023-07-19T19:08:58.298962.parquet"]}, {"split": "2023_09_22T16_48_41.866587", "path": ["results_2023-09-22T16-48-41.866587.parquet"]}, {"split": "latest", "path": ["results_2023-09-22T16-48-41.866587.parquet"]}]}]}
|
2023-09-22T15:48:59+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of SaylorTwift/gpt2_test
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model SaylorTwift/gpt2_test on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
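For instance (mirroring the snippet in the full card for this dataset above):

```python
from datasets import load_dataset

data = load_dataset("open-llm-leaderboard/details_SaylorTwift__gpt2_test",
	"harness_winogrande_5",
	split="train")
```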
## Latest results
These are the latest results from run 2023-09-22T16:48:41.866587 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of SaylorTwift/gpt2_test",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model SaylorTwift/gpt2_test on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-22T16:48:41.866587(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of SaylorTwift/gpt2_test",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model SaylorTwift/gpt2_test on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-22T16:48:41.866587(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
20,
31,
168,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of SaylorTwift/gpt2_test## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model SaylorTwift/gpt2_test on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-22T16:48:41.866587(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
465af39ff94d74d3f9faef3f77375cb3345645d0
|
# Dataset Card for Evaluation run of nicholasKluge/Aira-Instruct-774M
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/nicholasKluge/Aira-Instruct-774M
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [nicholasKluge/Aira-Instruct-774M](https://huggingface.co/nicholasKluge/Aira-Instruct-774M) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_nicholasKluge__Aira-Instruct-774M",
"harness_truthfulqa_mc_0",
split="train")
```
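The aggregated metrics mentioned above live in the "results" configuration; as a sketch (split names follow the run timestamps, with "latest" pointing to the most recent run):
```python
from datasets import load_dataset

# Load the aggregated results of the most recent run via the "results" config.
results = load_dataset("open-llm-leaderboard/details_nicholasKluge__Aira-Instruct-774M",
	"results",
	split="latest")
```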
## Latest results
These are the [latest results from run 2023-08-10T09:21:24.434701](https://huggingface.co/datasets/open-llm-leaderboard/details_nicholasKluge__Aira-Instruct-774M/blob/main/results_2023-08-10T09%3A21%3A24.434701.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2509346011019478,
"acc_stderr": 0.031496346014330656,
"acc_norm": 0.25261641064354023,
"acc_norm_stderr": 0.03150876208925782,
"mc1": 0.2521419828641371,
"mc1_stderr": 0.015201522246299972,
"mc2": 0.42741407501534695,
"mc2_stderr": 0.01548514736552057
},
"harness|arc:challenge|25": {
"acc": 0.24488054607508533,
"acc_stderr": 0.012566273985131358,
"acc_norm": 0.2815699658703072,
"acc_norm_stderr": 0.013143376735009024
},
"harness|hellaswag|10": {
"acc": 0.3481378211511651,
"acc_stderr": 0.004754063867700179,
"acc_norm": 0.4106751643098984,
"acc_norm_stderr": 0.004909509538525157
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768081,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768081
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.03944624162501116,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.03944624162501116
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.18421052631578946,
"acc_stderr": 0.0315469804508223,
"acc_norm": 0.18421052631578946,
"acc_norm_stderr": 0.0315469804508223
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2981132075471698,
"acc_stderr": 0.028152837942493875,
"acc_norm": 0.2981132075471698,
"acc_norm_stderr": 0.028152837942493875
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.26011560693641617,
"acc_stderr": 0.03345036916788992,
"acc_norm": 0.26011560693641617,
"acc_norm_stderr": 0.03345036916788992
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617749,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617749
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.32340425531914896,
"acc_stderr": 0.030579442773610334,
"acc_norm": 0.32340425531914896,
"acc_norm_stderr": 0.030579442773610334
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.04227054451232199,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.04227054451232199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135303,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135303
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2566137566137566,
"acc_stderr": 0.022494510767503154,
"acc_norm": 0.2566137566137566,
"acc_norm_stderr": 0.022494510767503154
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.15079365079365079,
"acc_stderr": 0.03200686497287392,
"acc_norm": 0.15079365079365079,
"acc_norm_stderr": 0.03200686497287392
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.23870967741935484,
"acc_stderr": 0.024251071262208834,
"acc_norm": 0.23870967741935484,
"acc_norm_stderr": 0.024251071262208834
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2660098522167488,
"acc_stderr": 0.03108982600293753,
"acc_norm": 0.2660098522167488,
"acc_norm_stderr": 0.03108982600293753
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.24848484848484848,
"acc_stderr": 0.03374402644139404,
"acc_norm": 0.24848484848484848,
"acc_norm_stderr": 0.03374402644139404
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.22727272727272727,
"acc_stderr": 0.029857515673386407,
"acc_norm": 0.22727272727272727,
"acc_norm_stderr": 0.029857515673386407
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.20725388601036268,
"acc_stderr": 0.02925282329180362,
"acc_norm": 0.20725388601036268,
"acc_norm_stderr": 0.02925282329180362
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2,
"acc_stderr": 0.020280805062535722,
"acc_norm": 0.2,
"acc_norm_stderr": 0.020280805062535722
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.026962424325073845,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.026962424325073845
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.23949579831932774,
"acc_stderr": 0.027722065493361273,
"acc_norm": 0.23949579831932774,
"acc_norm_stderr": 0.027722065493361273
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2052980132450331,
"acc_stderr": 0.03297986648473834,
"acc_norm": 0.2052980132450331,
"acc_norm_stderr": 0.03297986648473834
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.23853211009174313,
"acc_stderr": 0.01827257581023187,
"acc_norm": 0.23853211009174313,
"acc_norm_stderr": 0.01827257581023187
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.19907407407407407,
"acc_stderr": 0.027232298462690232,
"acc_norm": 0.19907407407407407,
"acc_norm_stderr": 0.027232298462690232
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.030587591351604246,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.030587591351604246
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2616033755274262,
"acc_stderr": 0.028609516716994934,
"acc_norm": 0.2616033755274262,
"acc_norm_stderr": 0.028609516716994934
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.2914798206278027,
"acc_stderr": 0.030500283176545906,
"acc_norm": 0.2914798206278027,
"acc_norm_stderr": 0.030500283176545906
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.25190839694656486,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.25190839694656486,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.19008264462809918,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.19008264462809918,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.28703703703703703,
"acc_stderr": 0.043733130409147614,
"acc_norm": 0.28703703703703703,
"acc_norm_stderr": 0.043733130409147614
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.26380368098159507,
"acc_stderr": 0.03462419931615624,
"acc_norm": 0.26380368098159507,
"acc_norm_stderr": 0.03462419931615624
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.042878587513404544,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.042878587513404544
},
"harness|hendrycksTest-management|5": {
"acc": 0.23300970873786409,
"acc_stderr": 0.041858325989283164,
"acc_norm": 0.23300970873786409,
"acc_norm_stderr": 0.041858325989283164
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2564102564102564,
"acc_stderr": 0.028605953702004253,
"acc_norm": 0.2564102564102564,
"acc_norm_stderr": 0.028605953702004253
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2835249042145594,
"acc_stderr": 0.016117318166832283,
"acc_norm": 0.2835249042145594,
"acc_norm_stderr": 0.016117318166832283
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2514450867052023,
"acc_stderr": 0.02335736578587404,
"acc_norm": 0.2514450867052023,
"acc_norm_stderr": 0.02335736578587404
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.25163398692810457,
"acc_stderr": 0.024848018263875195,
"acc_norm": 0.25163398692810457,
"acc_norm_stderr": 0.024848018263875195
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2733118971061093,
"acc_stderr": 0.02531176597542612,
"acc_norm": 0.2733118971061093,
"acc_norm_stderr": 0.02531176597542612
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.02438366553103545,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.02438366553103545
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.25886524822695034,
"acc_stderr": 0.02612957252718085,
"acc_norm": 0.25886524822695034,
"acc_norm_stderr": 0.02612957252718085
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2470664928292047,
"acc_stderr": 0.011015752255279336,
"acc_norm": 0.2470664928292047,
"acc_norm_stderr": 0.011015752255279336
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.17647058823529413,
"acc_stderr": 0.023157468308559373,
"acc_norm": 0.17647058823529413,
"acc_norm_stderr": 0.023157468308559373
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2565359477124183,
"acc_stderr": 0.01766784161237899,
"acc_norm": 0.2565359477124183,
"acc_norm_stderr": 0.01766784161237899
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2818181818181818,
"acc_stderr": 0.043091187099464585,
"acc_norm": 0.2818181818181818,
"acc_norm_stderr": 0.043091187099464585
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.17959183673469387,
"acc_stderr": 0.024573293589585637,
"acc_norm": 0.17959183673469387,
"acc_norm_stderr": 0.024573293589585637
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23880597014925373,
"acc_stderr": 0.030147775935409224,
"acc_norm": 0.23880597014925373,
"acc_norm_stderr": 0.030147775935409224
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3253012048192771,
"acc_stderr": 0.03647168523683228,
"acc_norm": 0.3253012048192771,
"acc_norm_stderr": 0.03647168523683228
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.19883040935672514,
"acc_stderr": 0.03061111655743253,
"acc_norm": 0.19883040935672514,
"acc_norm_stderr": 0.03061111655743253
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2521419828641371,
"mc1_stderr": 0.015201522246299972,
"mc2": 0.42741407501534695,
"mc2_stderr": 0.01548514736552057
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_nicholasKluge__Aira-Instruct-774M
|
[
"region:us"
] |
2023-08-18T10:16:11+00:00
|
{"pretty_name": "Evaluation run of nicholasKluge/Aira-Instruct-774M", "dataset_summary": "Dataset automatically created during the evaluation run of model [nicholasKluge/Aira-Instruct-774M](https://huggingface.co/nicholasKluge/Aira-Instruct-774M) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_nicholasKluge__Aira-Instruct-774M\",\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-08-10T09:21:24.434701](https://huggingface.co/datasets/open-llm-leaderboard/details_nicholasKluge__Aira-Instruct-774M/blob/main/results_2023-08-10T09%3A21%3A24.434701.json) (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2509346011019478,\n \"acc_stderr\": 0.031496346014330656,\n \"acc_norm\": 0.25261641064354023,\n \"acc_norm_stderr\": 0.03150876208925782,\n \"mc1\": 0.2521419828641371,\n \"mc1_stderr\": 0.015201522246299972,\n \"mc2\": 0.42741407501534695,\n \"mc2_stderr\": 0.01548514736552057\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.24488054607508533,\n \"acc_stderr\": 0.012566273985131358,\n \"acc_norm\": 0.2815699658703072,\n \"acc_norm_stderr\": 0.013143376735009024\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.3481378211511651,\n \"acc_stderr\": 0.004754063867700179,\n \"acc_norm\": 0.4106751643098984,\n \"acc_norm_stderr\": 0.004909509538525157\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768081,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768081\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2962962962962963,\n \"acc_stderr\": 0.03944624162501116,\n \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.03944624162501116\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.18421052631578946,\n \"acc_stderr\": 0.0315469804508223,\n \"acc_norm\": 0.18421052631578946,\n \"acc_norm_stderr\": 0.0315469804508223\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.2981132075471698,\n \"acc_stderr\": 0.028152837942493875,\n \"acc_norm\": 0.2981132075471698,\n \"acc_norm_stderr\": 0.028152837942493875\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 
0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384741\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.26011560693641617,\n \"acc_stderr\": 0.03345036916788992,\n \"acc_norm\": 0.26011560693641617,\n \"acc_norm_stderr\": 0.03345036916788992\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617749,\n \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617749\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.32340425531914896,\n \"acc_stderr\": 0.030579442773610334,\n \"acc_norm\": 0.32340425531914896,\n \"acc_norm_stderr\": 0.030579442773610334\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n \"acc_stderr\": 0.04227054451232199,\n \"acc_norm\": 0.2807017543859649,\n \"acc_norm_stderr\": 0.04227054451232199\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135303,\n \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135303\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"acc_norm\": 0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.15079365079365079,\n \"acc_stderr\": 0.03200686497287392,\n \"acc_norm\": 0.15079365079365079,\n \"acc_norm_stderr\": 0.03200686497287392\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.23870967741935484,\n \"acc_stderr\": 0.024251071262208834,\n \"acc_norm\": 0.23870967741935484,\n \"acc_norm_stderr\": 0.024251071262208834\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.2660098522167488,\n \"acc_stderr\": 0.03108982600293753,\n \"acc_norm\": 0.2660098522167488,\n \"acc_norm_stderr\": 0.03108982600293753\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.24848484848484848,\n \"acc_stderr\": 0.03374402644139404,\n \"acc_norm\": 0.24848484848484848,\n \"acc_norm_stderr\": 0.03374402644139404\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.22727272727272727,\n \"acc_stderr\": 0.029857515673386407,\n \"acc_norm\": 0.22727272727272727,\n \"acc_norm_stderr\": 0.029857515673386407\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.20725388601036268,\n \"acc_stderr\": 0.02925282329180362,\n \"acc_norm\": 0.20725388601036268,\n \"acc_norm_stderr\": 0.02925282329180362\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.020280805062535722,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.020280805062535722\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.26666666666666666,\n \"acc_stderr\": 0.026962424325073845,\n \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.026962424325073845\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.23949579831932774,\n \"acc_stderr\": 0.027722065493361273,\n \"acc_norm\": 0.23949579831932774,\n \"acc_norm_stderr\": 0.027722065493361273\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2052980132450331,\n \"acc_stderr\": 0.03297986648473834,\n \"acc_norm\": 0.2052980132450331,\n \"acc_norm_stderr\": 0.03297986648473834\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.23853211009174313,\n \"acc_stderr\": 0.01827257581023187,\n \"acc_norm\": 0.23853211009174313,\n \"acc_norm_stderr\": 0.01827257581023187\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.19907407407407407,\n \"acc_stderr\": 0.027232298462690232,\n \"acc_norm\": 0.19907407407407407,\n \"acc_norm_stderr\": 0.027232298462690232\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.030587591351604246,\n \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.030587591351604246\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.2616033755274262,\n \"acc_stderr\": 0.028609516716994934,\n \"acc_norm\": 0.2616033755274262,\n \"acc_norm_stderr\": 0.028609516716994934\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.2914798206278027,\n \"acc_stderr\": 0.030500283176545906,\n \"acc_norm\": 0.2914798206278027,\n \"acc_norm_stderr\": 0.030500283176545906\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.25190839694656486,\n \"acc_stderr\": 0.03807387116306086,\n \"acc_norm\": 0.25190839694656486,\n \"acc_norm_stderr\": 0.03807387116306086\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.19008264462809918,\n \"acc_stderr\": 0.03581796951709282,\n \"acc_norm\": 0.19008264462809918,\n \"acc_norm_stderr\": 0.03581796951709282\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.28703703703703703,\n \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.28703703703703703,\n \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.26380368098159507,\n \"acc_stderr\": 0.03462419931615624,\n \"acc_norm\": 0.26380368098159507,\n \"acc_norm_stderr\": 0.03462419931615624\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.042878587513404544,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.042878587513404544\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.23300970873786409,\n \"acc_stderr\": 0.041858325989283164,\n \"acc_norm\": 0.23300970873786409,\n \"acc_norm_stderr\": 0.041858325989283164\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2564102564102564,\n \"acc_stderr\": 0.028605953702004253,\n \"acc_norm\": 0.2564102564102564,\n \"acc_norm_stderr\": 0.028605953702004253\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.2835249042145594,\n \"acc_stderr\": 0.016117318166832283,\n \"acc_norm\": 0.2835249042145594,\n \"acc_norm_stderr\": 0.016117318166832283\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.2514450867052023,\n \"acc_stderr\": 0.02335736578587404,\n \"acc_norm\": 0.2514450867052023,\n \"acc_norm_stderr\": 0.02335736578587404\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.25163398692810457,\n \"acc_stderr\": 0.024848018263875195,\n \"acc_norm\": 0.25163398692810457,\n \"acc_norm_stderr\": 0.024848018263875195\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2733118971061093,\n \"acc_stderr\": 0.02531176597542612,\n \"acc_norm\": 0.2733118971061093,\n \"acc_norm_stderr\": 0.02531176597542612\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.02438366553103545,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.02438366553103545\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.25886524822695034,\n \"acc_stderr\": 0.02612957252718085,\n \"acc_norm\": 0.25886524822695034,\n \"acc_norm_stderr\": 0.02612957252718085\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2470664928292047,\n \"acc_stderr\": 0.011015752255279336,\n \"acc_norm\": 0.2470664928292047,\n \"acc_norm_stderr\": 0.011015752255279336\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.17647058823529413,\n \"acc_stderr\": 0.023157468308559373,\n \"acc_norm\": 0.17647058823529413,\n \"acc_norm_stderr\": 0.023157468308559373\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.2565359477124183,\n \"acc_stderr\": 0.01766784161237899,\n \"acc_norm\": 0.2565359477124183,\n \"acc_norm_stderr\": 0.01766784161237899\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2818181818181818,\n \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.2818181818181818,\n \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.17959183673469387,\n \"acc_stderr\": 0.024573293589585637,\n \"acc_norm\": 0.17959183673469387,\n \"acc_norm_stderr\": 0.024573293589585637\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23880597014925373,\n \"acc_stderr\": 0.030147775935409224,\n \"acc_norm\": 0.23880597014925373,\n \"acc_norm_stderr\": 0.030147775935409224\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3253012048192771,\n \"acc_stderr\": 0.03647168523683228,\n \"acc_norm\": 0.3253012048192771,\n \"acc_norm_stderr\": 0.03647168523683228\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.19883040935672514,\n \"acc_stderr\": 0.03061111655743253,\n \"acc_norm\": 0.19883040935672514,\n \"acc_norm_stderr\": 0.03061111655743253\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2521419828641371,\n \"mc1_stderr\": 0.015201522246299972,\n \"mc2\": 0.42741407501534695,\n \"mc2_stderr\": 0.01548514736552057\n }\n}\n```", "repo_url": "https://huggingface.co/nicholasKluge/Aira-Instruct-774M", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": 
"[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_10T09_20_56.838686", "path": ["**/details_harness|arc:challenge|25_2023-08-10T09:20:56.838686.parquet"]}, {"split": "2023_08_10T09_21_24.434701", "path": ["**/details_harness|arc:challenge|25_2023-08-10T09:21:24.434701.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-10T09:21:24.434701.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_10T09_20_56.838686", "path": ["**/details_harness|hellaswag|10_2023-08-10T09:20:56.838686.parquet"]}, {"split": "2023_08_10T09_21_24.434701", "path": ["**/details_harness|hellaswag|10_2023-08-10T09:21:24.434701.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-10T09:21:24.434701.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_10T09_20_56.838686", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-10T09:20:56.838686.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-10T09:20:56.838686.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-10T09:20:56.838686.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-10T09:20:56.838686.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-10T09:20:56.838686.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-10T09:20:56.838686.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-10T09:20:56.838686.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-10T09:20:56.838686.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-10T09:20:56.838686.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-10T09:20:56.838686.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-10T09:20:56.838686.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-10T09:20:56.838686.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-10T09:20:56.838686.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-10T09:20:56.838686.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-10T09:20:56.838686.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-10T09:20:56.838686.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-10T09:20:56.838686.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-10T09:20:56.838686.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-10T09:20:56.838686.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-10T09:20:56.838686.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-10T09:20:56.838686.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-10T09:20:56.838686.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-10T09:20:56.838686.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-10T09:20:56.838686.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-10T09:20:56.838686.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-10T09:20:56.838686.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-10T09:20:56.838686.parquet", 
"**/details_harness|hendrycksTest-high_school_physics|5_2023-08-10T09:20:56.838686.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-10T09:20:56.838686.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-10T09:20:56.838686.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-10T09:20:56.838686.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-10T09:20:56.838686.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-10T09:20:56.838686.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-10T09:20:56.838686.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-10T09:20:56.838686.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-10T09:20:56.838686.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-10T09:20:56.838686.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-10T09:20:56.838686.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-10T09:20:56.838686.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-10T09:20:56.838686.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-10T09:20:56.838686.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-10T09:20:56.838686.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-10T09:20:56.838686.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-10T09:20:56.838686.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-10T09:20:56.838686.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-10T09:20:56.838686.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-10T09:20:56.838686.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-10T09:20:56.838686.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-10T09:20:56.838686.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-10T09:20:56.838686.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-10T09:20:56.838686.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-10T09:20:56.838686.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-10T09:20:56.838686.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-10T09:20:56.838686.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-10T09:20:56.838686.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-10T09:20:56.838686.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-10T09:20:56.838686.parquet"]}, {"split": "2023_08_10T09_21_24.434701", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-10T09:21:24.434701.parquet", 
"**/details_harness|hendrycksTest-college_medicine|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-10T09:21:24.434701.parquet", 
"**/details_harness|hendrycksTest-professional_accounting|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-10T09:21:24.434701.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-10T09:21:24.434701.parquet", 
"**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-10T09:21:24.434701.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-10T09:21:24.434701.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_10T09_20_56.838686", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-10T09:20:56.838686.parquet"]}, {"split": "2023_08_10T09_21_24.434701", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-10T09:21:24.434701.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-10T09:21:24.434701.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_10T09_20_56.838686", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-10T09:20:56.838686.parquet"]}, {"split": "2023_08_10T09_21_24.434701", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-10T09:21:24.434701.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-10T09:21:24.434701.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_10T09_20_56.838686", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-10T09:20:56.838686.parquet"]}, {"split": "2023_08_10T09_21_24.434701", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-10T09:21:24.434701.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-10T09:21:24.434701.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_10T09_20_56.838686", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-10T09:20:56.838686.parquet"]}, {"split": "2023_08_10T09_21_24.434701", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-10T09:21:24.434701.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-10T09:21:24.434701.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_10T09_20_56.838686", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-10T09:20:56.838686.parquet"]}, {"split": "2023_08_10T09_21_24.434701", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-10T09:21:24.434701.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-10T09:21:24.434701.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_10T09_20_56.838686", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-10T09:20:56.838686.parquet"]}, {"split": "2023_08_10T09_21_24.434701", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-10T09:21:24.434701.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-10T09:21:24.434701.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_10T09_20_56.838686", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-10T09:20:56.838686.parquet"]}, {"split": "2023_08_10T09_21_24.434701", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-10T09:21:24.434701.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-10T09:21:24.434701.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_10T09_20_56.838686", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-10T09:20:56.838686.parquet"]}, {"split": "2023_08_10T09_21_24.434701", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-10T09:21:24.434701.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-10T09:21:24.434701.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_10T09_20_56.838686", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-10T09:20:56.838686.parquet"]}, {"split": "2023_08_10T09_21_24.434701", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-10T09:21:24.434701.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-10T09:21:24.434701.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_10T09_20_56.838686", "path": 
["**/details_harness|hendrycksTest-college_medicine|5_2023-08-10T09:20:56.838686.parquet"]}, {"split": "2023_08_10T09_21_24.434701", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-10T09:21:24.434701.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-10T09:21:24.434701.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_10T09_20_56.838686", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-10T09:20:56.838686.parquet"]}, {"split": "2023_08_10T09_21_24.434701", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-10T09:21:24.434701.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-10T09:21:24.434701.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_10T09_20_56.838686", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-10T09:20:56.838686.parquet"]}, {"split": "2023_08_10T09_21_24.434701", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-10T09:21:24.434701.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-10T09:21:24.434701.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_10T09_20_56.838686", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-10T09:20:56.838686.parquet"]}, {"split": "2023_08_10T09_21_24.434701", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-10T09:21:24.434701.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-10T09:21:24.434701.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_10T09_20_56.838686", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-10T09:20:56.838686.parquet"]}, {"split": "2023_08_10T09_21_24.434701", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-10T09:21:24.434701.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-10T09:21:24.434701.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_10T09_20_56.838686", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-10T09:20:56.838686.parquet"]}, {"split": "2023_08_10T09_21_24.434701", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-10T09:21:24.434701.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-10T09:21:24.434701.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_10T09_20_56.838686", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-10T09:20:56.838686.parquet"]}, {"split": "2023_08_10T09_21_24.434701", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-10T09:21:24.434701.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-10T09:21:24.434701.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_10T09_20_56.838686", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-10T09:20:56.838686.parquet"]}, {"split": "2023_08_10T09_21_24.434701", "path": 
["**/details_harness|hendrycksTest-formal_logic|5_2023-08-10T09:21:24.434701.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-10T09:21:24.434701.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_10T09_20_56.838686", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-10T09:20:56.838686.parquet"]}, {"split": "2023_08_10T09_21_24.434701", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-10T09:21:24.434701.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-10T09:21:24.434701.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_10T09_20_56.838686", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-10T09:20:56.838686.parquet"]}, {"split": "2023_08_10T09_21_24.434701", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-10T09:21:24.434701.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-10T09:21:24.434701.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_10T09_20_56.838686", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-10T09:20:56.838686.parquet"]}, {"split": "2023_08_10T09_21_24.434701", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-10T09:21:24.434701.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-10T09:21:24.434701.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_10T09_20_56.838686", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-10T09:20:56.838686.parquet"]}, {"split": "2023_08_10T09_21_24.434701", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-10T09:21:24.434701.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-10T09:21:24.434701.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_10T09_20_56.838686", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-10T09:20:56.838686.parquet"]}, {"split": "2023_08_10T09_21_24.434701", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-10T09:21:24.434701.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-10T09:21:24.434701.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_10T09_20_56.838686", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-10T09:20:56.838686.parquet"]}, {"split": "2023_08_10T09_21_24.434701", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-10T09:21:24.434701.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-10T09:21:24.434701.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_10T09_20_56.838686", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-10T09:20:56.838686.parquet"]}, {"split": "2023_08_10T09_21_24.434701", "path": 
["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-10T09:21:24.434701.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-10T09:21:24.434701.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_10T09_20_56.838686", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-10T09:20:56.838686.parquet"]}, {"split": "2023_08_10T09_21_24.434701", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-10T09:21:24.434701.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-10T09:21:24.434701.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_10T09_20_56.838686", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-10T09:20:56.838686.parquet"]}, {"split": "2023_08_10T09_21_24.434701", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-10T09:21:24.434701.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-10T09:21:24.434701.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_10T09_20_56.838686", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-10T09:20:56.838686.parquet"]}, {"split": "2023_08_10T09_21_24.434701", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-10T09:21:24.434701.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-10T09:21:24.434701.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_10T09_20_56.838686", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-10T09:20:56.838686.parquet"]}, {"split": "2023_08_10T09_21_24.434701", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-10T09:21:24.434701.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-10T09:21:24.434701.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_10T09_20_56.838686", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-10T09:20:56.838686.parquet"]}, {"split": "2023_08_10T09_21_24.434701", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-10T09:21:24.434701.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-10T09:21:24.434701.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_10T09_20_56.838686", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-10T09:20:56.838686.parquet"]}, {"split": "2023_08_10T09_21_24.434701", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-10T09:21:24.434701.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-10T09:21:24.434701.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_10T09_20_56.838686", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-10T09:20:56.838686.parquet"]}, {"split": 
"2023_08_10T09_21_24.434701", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-10T09:21:24.434701.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-10T09:21:24.434701.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_10T09_20_56.838686", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-10T09:20:56.838686.parquet"]}, {"split": "2023_08_10T09_21_24.434701", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-10T09:21:24.434701.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-10T09:21:24.434701.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_10T09_20_56.838686", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-10T09:20:56.838686.parquet"]}, {"split": "2023_08_10T09_21_24.434701", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-10T09:21:24.434701.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-10T09:21:24.434701.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_10T09_20_56.838686", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-10T09:20:56.838686.parquet"]}, {"split": "2023_08_10T09_21_24.434701", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-10T09:21:24.434701.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-10T09:21:24.434701.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_10T09_20_56.838686", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-10T09:20:56.838686.parquet"]}, {"split": "2023_08_10T09_21_24.434701", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-10T09:21:24.434701.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-10T09:21:24.434701.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_10T09_20_56.838686", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-10T09:20:56.838686.parquet"]}, {"split": "2023_08_10T09_21_24.434701", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-10T09:21:24.434701.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-10T09:21:24.434701.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_10T09_20_56.838686", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-10T09:20:56.838686.parquet"]}, {"split": "2023_08_10T09_21_24.434701", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-10T09:21:24.434701.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-10T09:21:24.434701.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_10T09_20_56.838686", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-10T09:20:56.838686.parquet"]}, {"split": "2023_08_10T09_21_24.434701", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-10T09:21:24.434701.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-machine_learning|5_2023-08-10T09:21:24.434701.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_10T09_20_56.838686", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-10T09:20:56.838686.parquet"]}, {"split": "2023_08_10T09_21_24.434701", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-10T09:21:24.434701.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-10T09:21:24.434701.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_10T09_20_56.838686", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-10T09:20:56.838686.parquet"]}, {"split": "2023_08_10T09_21_24.434701", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-10T09:21:24.434701.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-10T09:21:24.434701.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_10T09_20_56.838686", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-10T09:20:56.838686.parquet"]}, {"split": "2023_08_10T09_21_24.434701", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-10T09:21:24.434701.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-10T09:21:24.434701.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_10T09_20_56.838686", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-10T09:20:56.838686.parquet"]}, {"split": "2023_08_10T09_21_24.434701", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-10T09:21:24.434701.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-10T09:21:24.434701.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_10T09_20_56.838686", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-10T09:20:56.838686.parquet"]}, {"split": "2023_08_10T09_21_24.434701", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-10T09:21:24.434701.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-10T09:21:24.434701.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_10T09_20_56.838686", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-10T09:20:56.838686.parquet"]}, {"split": "2023_08_10T09_21_24.434701", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-10T09:21:24.434701.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-10T09:21:24.434701.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_10T09_20_56.838686", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-10T09:20:56.838686.parquet"]}, {"split": "2023_08_10T09_21_24.434701", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-10T09:21:24.434701.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-10T09:21:24.434701.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_10T09_20_56.838686", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-10T09:20:56.838686.parquet"]}, 
{"split": "2023_08_10T09_21_24.434701", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-10T09:21:24.434701.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-10T09:21:24.434701.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_10T09_20_56.838686", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-10T09:20:56.838686.parquet"]}, {"split": "2023_08_10T09_21_24.434701", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-10T09:21:24.434701.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-10T09:21:24.434701.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_10T09_20_56.838686", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-10T09:20:56.838686.parquet"]}, {"split": "2023_08_10T09_21_24.434701", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-10T09:21:24.434701.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-10T09:21:24.434701.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_10T09_20_56.838686", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-10T09:20:56.838686.parquet"]}, {"split": "2023_08_10T09_21_24.434701", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-10T09:21:24.434701.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-10T09:21:24.434701.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_10T09_20_56.838686", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-10T09:20:56.838686.parquet"]}, {"split": "2023_08_10T09_21_24.434701", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-10T09:21:24.434701.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-10T09:21:24.434701.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_10T09_20_56.838686", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-10T09:20:56.838686.parquet"]}, {"split": "2023_08_10T09_21_24.434701", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-10T09:21:24.434701.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-10T09:21:24.434701.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_10T09_20_56.838686", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-10T09:20:56.838686.parquet"]}, {"split": "2023_08_10T09_21_24.434701", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-10T09:21:24.434701.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-10T09:21:24.434701.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_10T09_20_56.838686", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-10T09:20:56.838686.parquet"]}, {"split": "2023_08_10T09_21_24.434701", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-10T09:21:24.434701.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-10T09:21:24.434701.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_10T09_20_56.838686", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-10T09:20:56.838686.parquet"]}, {"split": "2023_08_10T09_21_24.434701", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-10T09:21:24.434701.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-10T09:21:24.434701.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_10T09_20_56.838686", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-10T09:20:56.838686.parquet"]}, {"split": "2023_08_10T09_21_24.434701", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-10T09:21:24.434701.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-10T09:21:24.434701.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_10T09_20_56.838686", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-10T09:20:56.838686.parquet"]}, {"split": "2023_08_10T09_21_24.434701", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-10T09:21:24.434701.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-10T09:21:24.434701.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_10T09_20_56.838686", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-10T09:20:56.838686.parquet"]}, {"split": "2023_08_10T09_21_24.434701", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-10T09:21:24.434701.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-10T09:21:24.434701.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_10T09_20_56.838686", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-10T09:20:56.838686.parquet"]}, {"split": "2023_08_10T09_21_24.434701", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-10T09:21:24.434701.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-10T09:21:24.434701.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_10T09_20_56.838686", "path": ["results_2023-08-10T09:20:56.838686.parquet"]}, {"split": "2023_08_10T09_21_24.434701", "path": ["results_2023-08-10T09:21:24.434701.parquet"]}, {"split": "latest", "path": ["results_2023-08-10T09:21:24.434701.parquet"]}]}]}
|
2023-08-27T11:31:55+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of nicholasKluge/Aira-Instruct-774M
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model nicholasKluge/Aira-Instruct-774M on the Open LLM Leaderboard.
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
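A minimal sketch of the usual loading call (the repository id below is inferred from the leaderboard's `details_<org>__<model>` naming convention rather than confirmed in this copy; the configuration name is one of those listed in this dataset's metadata):

```python
from datasets import load_dataset

# Hypothetical repository id, inferred from the leaderboard's naming pattern.
data = load_dataset("open-llm-leaderboard/details_nicholasKluge__Aira-Instruct-774M",
	"harness_truthfulqa_mc_0",  # one of the 61 configurations of this dataset
	split="train")
```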
## Latest results
These are the latest results from run 2023-08-10T09:21:24.434701 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
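The per-run results JSON is not reproduced in this processed copy. As a sketch (same repository-id assumption as above), the aggregated numbers can be fetched from the dedicated "results" configuration, whose "latest" split resolves to the 2023-08-10T09:21:24.434701 run according to the configuration metadata:

```python
from datasets import load_dataset

# "latest" points at results_2023-08-10T09:21:24.434701.parquet per the config metadata.
results = load_dataset("open-llm-leaderboard/details_nicholasKluge__Aira-Instruct-774M",
	"results",
	split="latest")
```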
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of nicholasKluge/Aira-Instruct-774M",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model nicholasKluge/Aira-Instruct-774M on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The \"train\" split always points to the latest results.\n\nAn additional configuration \"results\" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-08-10T09:21:24.434701 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of nicholasKluge/Aira-Instruct-774M",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model nicholasKluge/Aira-Instruct-774M on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The \"train\" split always points to the latest results.\n\nAn additional configuration \"results\" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-08-10T09:21:24.434701 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
23,
31,
171,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of nicholasKluge/Aira-Instruct-774M## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model nicholasKluge/Aira-Instruct-774M on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The \"train\" split always points to the latest results.\n\nAn additional configuration \"results\" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-08-10T09:21:24.434701 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
3b7293f7fbfe3a36770324a80f9976cc0752997c
|
# Dataset Card for Evaluation run of nicholasKluge/Aira-Instruct-PT-1B7
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/nicholasKluge/Aira-Instruct-PT-1B7
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [nicholasKluge/Aira-Instruct-PT-1B7](https://huggingface.co/nicholasKluge/Aira-Instruct-PT-1B7) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_nicholasKluge__Aira-Instruct-PT-1B7",
"harness_truthfulqa_mc_0",
split="train")
```
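As a usage variant (a sketch assuming this repository exposes the same "results" configuration layout as its sibling leaderboard datasets), the aggregated metrics and a specific timestamped run can be loaded directly:

```python
from datasets import load_dataset

# Aggregated metrics; the "latest" split always points at the most recent run.
results = load_dataset("open-llm-leaderboard/details_nicholasKluge__Aira-Instruct-PT-1B7",
	"results",
	split="latest")

# A single run is addressed by its timestamped split name,
# e.g. the 2023-08-09T20:59:57.404122 run listed in this card's metadata.
run = load_dataset("open-llm-leaderboard/details_nicholasKluge__Aira-Instruct-PT-1B7",
	"harness_truthfulqa_mc_0",
	split="2023_08_09T20_59_57.404122")
```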
## Latest results
These are the [latest results from run 2023-08-09T20:59:57.404122](https://huggingface.co/datasets/open-llm-leaderboard/details_nicholasKluge__Aira-Instruct-PT-1B7/blob/main/results_2023-08-09T20%3A59%3A57.404122.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2495089085448401,
"acc_stderr": 0.03135286921160441,
"acc_norm": 0.2508452551647926,
"acc_norm_stderr": 0.03137437179137316,
"mc1": 0.22888616891064872,
"mc1_stderr": 0.014706994909055028,
"mc2": 0.4595409979303444,
"mc2_stderr": 0.01663090921738331
},
"harness|arc:challenge|25": {
"acc": 0.2030716723549488,
"acc_stderr": 0.011755899303705583,
"acc_norm": 0.2687713310580205,
"acc_norm_stderr": 0.012955065963710672
},
"harness|hellaswag|10": {
"acc": 0.25941047600079664,
"acc_stderr": 0.004374153847826759,
"acc_norm": 0.2725552678749253,
"acc_norm_stderr": 0.004443639394177424
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.19736842105263158,
"acc_stderr": 0.03238981601699397,
"acc_norm": 0.19736842105263158,
"acc_norm_stderr": 0.03238981601699397
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.26037735849056604,
"acc_stderr": 0.027008766090708087,
"acc_norm": 0.26037735849056604,
"acc_norm_stderr": 0.027008766090708087
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2361111111111111,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.2361111111111111,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.17,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.17,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.22,
"acc_stderr": 0.0416333199893227,
"acc_norm": 0.22,
"acc_norm_stderr": 0.0416333199893227
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.21965317919075145,
"acc_stderr": 0.031568093627031744,
"acc_norm": 0.21965317919075145,
"acc_norm_stderr": 0.031568093627031744
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.04280105837364395,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.04280105837364395
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.19,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.19,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.24680851063829787,
"acc_stderr": 0.028185441301234102,
"acc_norm": 0.24680851063829787,
"acc_norm_stderr": 0.028185441301234102
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.04049339297748141,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.04049339297748141
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2620689655172414,
"acc_stderr": 0.036646663372252565,
"acc_norm": 0.2620689655172414,
"acc_norm_stderr": 0.036646663372252565
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25132275132275134,
"acc_stderr": 0.022340482339643898,
"acc_norm": 0.25132275132275134,
"acc_norm_stderr": 0.022340482339643898
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.16666666666666666,
"acc_stderr": 0.03333333333333337,
"acc_norm": 0.16666666666666666,
"acc_norm_stderr": 0.03333333333333337
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.19,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.19,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2870967741935484,
"acc_stderr": 0.025736542745594525,
"acc_norm": 0.2870967741935484,
"acc_norm_stderr": 0.025736542745594525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3054187192118227,
"acc_stderr": 0.03240661565868408,
"acc_norm": 0.3054187192118227,
"acc_norm_stderr": 0.03240661565868408
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.24848484848484848,
"acc_stderr": 0.033744026441394036,
"acc_norm": 0.24848484848484848,
"acc_norm_stderr": 0.033744026441394036
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2676767676767677,
"acc_stderr": 0.03154449888270286,
"acc_norm": 0.2676767676767677,
"acc_norm_stderr": 0.03154449888270286
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.22797927461139897,
"acc_stderr": 0.03027690994517826,
"acc_norm": 0.22797927461139897,
"acc_norm_stderr": 0.03027690994517826
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2128205128205128,
"acc_stderr": 0.020752423722128013,
"acc_norm": 0.2128205128205128,
"acc_norm_stderr": 0.020752423722128013
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2851851851851852,
"acc_stderr": 0.027528599210340492,
"acc_norm": 0.2851851851851852,
"acc_norm_stderr": 0.027528599210340492
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.24789915966386555,
"acc_stderr": 0.028047967224176892,
"acc_norm": 0.24789915966386555,
"acc_norm_stderr": 0.028047967224176892
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.23841059602649006,
"acc_stderr": 0.0347918557259966,
"acc_norm": 0.23841059602649006,
"acc_norm_stderr": 0.0347918557259966
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.344954128440367,
"acc_stderr": 0.020380605405066966,
"acc_norm": 0.344954128440367,
"acc_norm_stderr": 0.020380605405066966
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3472222222222222,
"acc_stderr": 0.032468872436376486,
"acc_norm": 0.3472222222222222,
"acc_norm_stderr": 0.032468872436376486
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.22058823529411764,
"acc_stderr": 0.029102254389674082,
"acc_norm": 0.22058823529411764,
"acc_norm_stderr": 0.029102254389674082
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.23628691983122363,
"acc_stderr": 0.027652153144159263,
"acc_norm": 0.23628691983122363,
"acc_norm_stderr": 0.027652153144159263
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.25112107623318386,
"acc_stderr": 0.029105220833224615,
"acc_norm": 0.25112107623318386,
"acc_norm_stderr": 0.029105220833224615
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.22900763358778625,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.22900763358778625,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.21487603305785125,
"acc_stderr": 0.03749492448709698,
"acc_norm": 0.21487603305785125,
"acc_norm_stderr": 0.03749492448709698
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.044143436668549335,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.044143436668549335
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3006134969325153,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.3006134969325153,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.25892857142857145,
"acc_stderr": 0.04157751539865629,
"acc_norm": 0.25892857142857145,
"acc_norm_stderr": 0.04157751539865629
},
"harness|hendrycksTest-management|5": {
"acc": 0.20388349514563106,
"acc_stderr": 0.03989139859531773,
"acc_norm": 0.20388349514563106,
"acc_norm_stderr": 0.03989139859531773
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.23931623931623933,
"acc_stderr": 0.027951826808924333,
"acc_norm": 0.23931623931623933,
"acc_norm_stderr": 0.027951826808924333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.26436781609195403,
"acc_stderr": 0.01576998484069052,
"acc_norm": 0.26436781609195403,
"acc_norm_stderr": 0.01576998484069052
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.26011560693641617,
"acc_stderr": 0.02361867831006937,
"acc_norm": 0.26011560693641617,
"acc_norm_stderr": 0.02361867831006937
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808835,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808835
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.02392915551735129,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.02392915551735129
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.21543408360128619,
"acc_stderr": 0.023350225475471425,
"acc_norm": 0.21543408360128619,
"acc_norm_stderr": 0.023350225475471425
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.23148148148148148,
"acc_stderr": 0.02346842983245114,
"acc_norm": 0.23148148148148148,
"acc_norm_stderr": 0.02346842983245114
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.25886524822695034,
"acc_stderr": 0.026129572527180844,
"acc_norm": 0.25886524822695034,
"acc_norm_stderr": 0.026129572527180844
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24511082138200782,
"acc_stderr": 0.010986307870045517,
"acc_norm": 0.24511082138200782,
"acc_norm_stderr": 0.010986307870045517
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.35661764705882354,
"acc_stderr": 0.029097209568411952,
"acc_norm": 0.35661764705882354,
"acc_norm_stderr": 0.029097209568411952
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25163398692810457,
"acc_stderr": 0.017555818091322263,
"acc_norm": 0.25163398692810457,
"acc_norm_stderr": 0.017555818091322263
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.20909090909090908,
"acc_stderr": 0.03895091015724138,
"acc_norm": 0.20909090909090908,
"acc_norm_stderr": 0.03895091015724138
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.24489795918367346,
"acc_stderr": 0.027529637440174934,
"acc_norm": 0.24489795918367346,
"acc_norm_stderr": 0.027529637440174934
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.2537313432835821,
"acc_stderr": 0.03076944496729601,
"acc_norm": 0.2537313432835821,
"acc_norm_stderr": 0.03076944496729601
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036843,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036843
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.033773102522091945,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.033773102522091945
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22888616891064872,
"mc1_stderr": 0.014706994909055028,
"mc2": 0.4595409979303444,
"mc2_stderr": 0.01663090921738331
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_nicholasKluge__Aira-Instruct-PT-1B7
|
[
"region:us"
] |
2023-08-18T10:16:22+00:00
|
{"pretty_name": "Evaluation run of nicholasKluge/Aira-Instruct-PT-1B7", "dataset_summary": "Dataset automatically created during the evaluation run of model [nicholasKluge/Aira-Instruct-PT-1B7](https://huggingface.co/nicholasKluge/Aira-Instruct-PT-1B7) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_nicholasKluge__Aira-Instruct-PT-1B7\",\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-08-09T20:59:57.404122](https://huggingface.co/datasets/open-llm-leaderboard/details_nicholasKluge__Aira-Instruct-PT-1B7/blob/main/results_2023-08-09T20%3A59%3A57.404122.json) (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2495089085448401,\n \"acc_stderr\": 0.03135286921160441,\n \"acc_norm\": 0.2508452551647926,\n \"acc_norm_stderr\": 0.03137437179137316,\n \"mc1\": 0.22888616891064872,\n \"mc1_stderr\": 0.014706994909055028,\n \"mc2\": 0.4595409979303444,\n \"mc2_stderr\": 0.01663090921738331\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.2030716723549488,\n \"acc_stderr\": 0.011755899303705583,\n \"acc_norm\": 0.2687713310580205,\n \"acc_norm_stderr\": 0.012955065963710672\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.25941047600079664,\n \"acc_stderr\": 0.004374153847826759,\n \"acc_norm\": 0.2725552678749253,\n \"acc_norm_stderr\": 0.004443639394177424\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.19736842105263158,\n \"acc_stderr\": 0.03238981601699397,\n \"acc_norm\": 0.19736842105263158,\n \"acc_norm_stderr\": 0.03238981601699397\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.26037735849056604,\n \"acc_stderr\": 0.027008766090708087,\n \"acc_norm\": 0.26037735849056604,\n \"acc_norm_stderr\": 0.027008766090708087\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2361111111111111,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.2361111111111111,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.17,\n 
\"acc_stderr\": 0.03775251680686371,\n \"acc_norm\": 0.17,\n \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.0416333199893227,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.0416333199893227\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.21965317919075145,\n \"acc_stderr\": 0.031568093627031744,\n \"acc_norm\": 0.21965317919075145,\n \"acc_norm_stderr\": 0.031568093627031744\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.04280105837364395,\n \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.04280105837364395\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.19,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.24680851063829787,\n \"acc_stderr\": 0.028185441301234102,\n \"acc_norm\": 0.24680851063829787,\n \"acc_norm_stderr\": 0.028185441301234102\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n \"acc_stderr\": 0.04049339297748141,\n \"acc_norm\": 0.24561403508771928,\n \"acc_norm_stderr\": 0.04049339297748141\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2620689655172414,\n \"acc_stderr\": 0.036646663372252565,\n \"acc_norm\": 0.2620689655172414,\n \"acc_norm_stderr\": 0.036646663372252565\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.25132275132275134,\n \"acc_stderr\": 0.022340482339643898,\n \"acc_norm\": 0.25132275132275134,\n \"acc_norm_stderr\": 0.022340482339643898\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.16666666666666666,\n \"acc_stderr\": 0.03333333333333337,\n \"acc_norm\": 0.16666666666666666,\n \"acc_norm_stderr\": 0.03333333333333337\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.19,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2870967741935484,\n \"acc_stderr\": 0.025736542745594525,\n \"acc_norm\": 0.2870967741935484,\n \"acc_norm_stderr\": 0.025736542745594525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.3054187192118227,\n \"acc_stderr\": 0.03240661565868408,\n \"acc_norm\": 0.3054187192118227,\n \"acc_norm_stderr\": 0.03240661565868408\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.24848484848484848,\n \"acc_stderr\": 0.033744026441394036,\n \"acc_norm\": 0.24848484848484848,\n \"acc_norm_stderr\": 0.033744026441394036\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.2676767676767677,\n \"acc_stderr\": 0.03154449888270286,\n \"acc_norm\": 0.2676767676767677,\n \"acc_norm_stderr\": 0.03154449888270286\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.22797927461139897,\n \"acc_stderr\": 0.03027690994517826,\n \"acc_norm\": 0.22797927461139897,\n \"acc_norm_stderr\": 0.03027690994517826\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.2128205128205128,\n \"acc_stderr\": 0.020752423722128013,\n \"acc_norm\": 0.2128205128205128,\n \"acc_norm_stderr\": 0.020752423722128013\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2851851851851852,\n \"acc_stderr\": 0.027528599210340492,\n \"acc_norm\": 0.2851851851851852,\n \"acc_norm_stderr\": 0.027528599210340492\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.24789915966386555,\n \"acc_stderr\": 0.028047967224176892,\n \"acc_norm\": 0.24789915966386555,\n \"acc_norm_stderr\": 0.028047967224176892\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.23841059602649006,\n \"acc_stderr\": 0.0347918557259966,\n \"acc_norm\": 0.23841059602649006,\n \"acc_norm_stderr\": 0.0347918557259966\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.344954128440367,\n \"acc_stderr\": 0.020380605405066966,\n \"acc_norm\": 0.344954128440367,\n \"acc_norm_stderr\": 0.020380605405066966\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.3472222222222222,\n \"acc_stderr\": 0.032468872436376486,\n \"acc_norm\": 0.3472222222222222,\n \"acc_norm_stderr\": 0.032468872436376486\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.22058823529411764,\n \"acc_stderr\": 0.029102254389674082,\n \"acc_norm\": 0.22058823529411764,\n \"acc_norm_stderr\": 0.029102254389674082\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.23628691983122363,\n \"acc_stderr\": 0.027652153144159263,\n \"acc_norm\": 0.23628691983122363,\n \"acc_norm_stderr\": 0.027652153144159263\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.25112107623318386,\n \"acc_stderr\": 0.029105220833224615,\n \"acc_norm\": 0.25112107623318386,\n \"acc_norm_stderr\": 0.029105220833224615\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.22900763358778625,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.22900763358778625,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.21487603305785125,\n \"acc_stderr\": 0.03749492448709698,\n \"acc_norm\": 0.21487603305785125,\n \"acc_norm_stderr\": 0.03749492448709698\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2962962962962963,\n \"acc_stderr\": 0.044143436668549335,\n \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.044143436668549335\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.3006134969325153,\n \"acc_stderr\": 0.03602511318806771,\n \"acc_norm\": 0.3006134969325153,\n \"acc_norm_stderr\": 0.03602511318806771\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.25892857142857145,\n \"acc_stderr\": 0.04157751539865629,\n \"acc_norm\": 0.25892857142857145,\n \"acc_norm_stderr\": 0.04157751539865629\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.20388349514563106,\n \"acc_stderr\": 0.03989139859531773,\n \"acc_norm\": 0.20388349514563106,\n \"acc_norm_stderr\": 0.03989139859531773\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.23931623931623933,\n \"acc_stderr\": 0.027951826808924333,\n \"acc_norm\": 0.23931623931623933,\n \"acc_norm_stderr\": 0.027951826808924333\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.26436781609195403,\n \"acc_stderr\": 0.01576998484069052,\n \"acc_norm\": 0.26436781609195403,\n \"acc_norm_stderr\": 0.01576998484069052\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.26011560693641617,\n \"acc_stderr\": 0.02361867831006937,\n \"acc_norm\": 0.26011560693641617,\n \"acc_norm_stderr\": 0.02361867831006937\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n \"acc_stderr\": 0.014422292204808835,\n \"acc_norm\": 0.24692737430167597,\n \"acc_norm_stderr\": 0.014422292204808835\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.02392915551735129,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.02392915551735129\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.21543408360128619,\n \"acc_stderr\": 0.023350225475471425,\n \"acc_norm\": 0.21543408360128619,\n \"acc_norm_stderr\": 0.023350225475471425\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.23148148148148148,\n \"acc_stderr\": 0.02346842983245114,\n \"acc_norm\": 0.23148148148148148,\n \"acc_norm_stderr\": 0.02346842983245114\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.25886524822695034,\n \"acc_stderr\": 0.026129572527180844,\n \"acc_norm\": 0.25886524822695034,\n \"acc_norm_stderr\": 0.026129572527180844\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24511082138200782,\n \"acc_stderr\": 0.010986307870045517,\n \"acc_norm\": 0.24511082138200782,\n \"acc_norm_stderr\": 0.010986307870045517\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.35661764705882354,\n \"acc_stderr\": 0.029097209568411952,\n \"acc_norm\": 0.35661764705882354,\n \"acc_norm_stderr\": 0.029097209568411952\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.25163398692810457,\n \"acc_stderr\": 0.017555818091322263,\n \"acc_norm\": 0.25163398692810457,\n \"acc_norm_stderr\": 0.017555818091322263\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.20909090909090908,\n \"acc_stderr\": 0.03895091015724138,\n \"acc_norm\": 0.20909090909090908,\n \"acc_norm_stderr\": 0.03895091015724138\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.24489795918367346,\n \"acc_stderr\": 0.027529637440174934,\n \"acc_norm\": 0.24489795918367346,\n \"acc_norm_stderr\": 0.027529637440174934\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.2537313432835821,\n \"acc_stderr\": 0.03076944496729601,\n \"acc_norm\": 0.2537313432835821,\n \"acc_norm_stderr\": 0.03076944496729601\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036843,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036843\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.2631578947368421,\n \"acc_stderr\": 0.033773102522091945,\n \"acc_norm\": 0.2631578947368421,\n \"acc_norm_stderr\": 0.033773102522091945\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22888616891064872,\n \"mc1_stderr\": 0.014706994909055028,\n \"mc2\": 0.4595409979303444,\n \"mc2_stderr\": 0.01663090921738331\n }\n}\n```", "repo_url": "https://huggingface.co/nicholasKluge/Aira-Instruct-PT-1B7", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_09T20_59_57.404122", "path": ["**/details_harness|arc:challenge|25_2023-08-09T20:59:57.404122.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-09T20:59:57.404122.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_09T20_59_57.404122", "path": ["**/details_harness|hellaswag|10_2023-08-09T20:59:57.404122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-09T20:59:57.404122.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_09T20_59_57.404122", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T20:59:57.404122.parquet", 
"**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T20:59:57.404122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T20:59:57.404122.parquet", 
"**/details_harness|hendrycksTest-computer_security|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T20:59:57.404122.parquet", 
"**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T20:59:57.404122.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T20:59:57.404122.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_09T20_59_57.404122", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T20:59:57.404122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T20:59:57.404122.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_09T20_59_57.404122", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T20:59:57.404122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T20:59:57.404122.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_09T20_59_57.404122", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T20:59:57.404122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T20:59:57.404122.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_09T20_59_57.404122", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T20:59:57.404122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T20:59:57.404122.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_09T20_59_57.404122", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T20:59:57.404122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T20:59:57.404122.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_09T20_59_57.404122", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T20:59:57.404122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T20:59:57.404122.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_09T20_59_57.404122", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T20:59:57.404122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T20:59:57.404122.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_09T20_59_57.404122", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T20:59:57.404122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T20:59:57.404122.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_09T20_59_57.404122", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T20:59:57.404122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T20:59:57.404122.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_09T20_59_57.404122", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T20:59:57.404122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T20:59:57.404122.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_09T20_59_57.404122", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T20:59:57.404122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T20:59:57.404122.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_09T20_59_57.404122", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T20:59:57.404122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T20:59:57.404122.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_09T20_59_57.404122", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T20:59:57.404122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T20:59:57.404122.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_09T20_59_57.404122", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T20:59:57.404122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T20:59:57.404122.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_09T20_59_57.404122", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T20:59:57.404122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T20:59:57.404122.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_09T20_59_57.404122", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T20:59:57.404122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T20:59:57.404122.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_09T20_59_57.404122", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T20:59:57.404122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T20:59:57.404122.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_09T20_59_57.404122", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T20:59:57.404122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T20:59:57.404122.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_09T20_59_57.404122", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T20:59:57.404122.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T20:59:57.404122.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_09T20_59_57.404122", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T20:59:57.404122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T20:59:57.404122.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_09T20_59_57.404122", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T20:59:57.404122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T20:59:57.404122.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_09T20_59_57.404122", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T20:59:57.404122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T20:59:57.404122.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_09T20_59_57.404122", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T20:59:57.404122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T20:59:57.404122.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_09T20_59_57.404122", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T20:59:57.404122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T20:59:57.404122.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_09T20_59_57.404122", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T20:59:57.404122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T20:59:57.404122.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_09T20_59_57.404122", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T20:59:57.404122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T20:59:57.404122.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_09T20_59_57.404122", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T20:59:57.404122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T20:59:57.404122.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_09T20_59_57.404122", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T20:59:57.404122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T20:59:57.404122.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_09T20_59_57.404122", "path": 
["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T20:59:57.404122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T20:59:57.404122.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_09T20_59_57.404122", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T20:59:57.404122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T20:59:57.404122.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_09T20_59_57.404122", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T20:59:57.404122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T20:59:57.404122.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_09T20_59_57.404122", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T20:59:57.404122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T20:59:57.404122.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_09T20_59_57.404122", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T20:59:57.404122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T20:59:57.404122.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_09T20_59_57.404122", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T20:59:57.404122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T20:59:57.404122.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_09T20_59_57.404122", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T20:59:57.404122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T20:59:57.404122.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_09T20_59_57.404122", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T20:59:57.404122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T20:59:57.404122.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_09T20_59_57.404122", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T20:59:57.404122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T20:59:57.404122.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_09T20_59_57.404122", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T20:59:57.404122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T20:59:57.404122.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_09T20_59_57.404122", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T20:59:57.404122.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-management|5_2023-08-09T20:59:57.404122.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_09T20_59_57.404122", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T20:59:57.404122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T20:59:57.404122.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_09T20_59_57.404122", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T20:59:57.404122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T20:59:57.404122.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_09T20_59_57.404122", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T20:59:57.404122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T20:59:57.404122.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_09T20_59_57.404122", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T20:59:57.404122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T20:59:57.404122.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_09T20_59_57.404122", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T20:59:57.404122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T20:59:57.404122.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_09T20_59_57.404122", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T20:59:57.404122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T20:59:57.404122.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_09T20_59_57.404122", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T20:59:57.404122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T20:59:57.404122.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_09T20_59_57.404122", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T20:59:57.404122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T20:59:57.404122.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_09T20_59_57.404122", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T20:59:57.404122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T20:59:57.404122.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_09T20_59_57.404122", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T20:59:57.404122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T20:59:57.404122.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_09T20_59_57.404122", "path": 
["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T20:59:57.404122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T20:59:57.404122.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_09T20_59_57.404122", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T20:59:57.404122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T20:59:57.404122.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_09T20_59_57.404122", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T20:59:57.404122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T20:59:57.404122.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_09T20_59_57.404122", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T20:59:57.404122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T20:59:57.404122.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_09T20_59_57.404122", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T20:59:57.404122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T20:59:57.404122.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_09T20_59_57.404122", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T20:59:57.404122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T20:59:57.404122.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_09T20_59_57.404122", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T20:59:57.404122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T20:59:57.404122.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_09T20_59_57.404122", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T20:59:57.404122.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T20:59:57.404122.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_09T20_59_57.404122", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T20:59:57.404122.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T20:59:57.404122.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_09T20_59_57.404122", "path": ["results_2023-08-09T20:59:57.404122.parquet"]}, {"split": "latest", "path": ["results_2023-08-09T20:59:57.404122.parquet"]}]}]}
|
2023-08-27T11:31:57+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of nicholasKluge/Aira-Instruct-PT-1B7
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model nicholasKluge/Aira-Instruct-PT-1B7 on the Open LLM Leaderboard.
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
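A minimal sketch, assuming the standard `details_<org>__<model>` repository naming used by the leaderboard and one of the configuration names listed in this card's metadata:
```python
from datasets import load_dataset

# Repository id follows the leaderboard's details naming convention;
# "harness_truthfulqa_mc_0" is one of the configurations declared in the metadata.
data = load_dataset("open-llm-leaderboard/details_nicholasKluge__Aira-Instruct-PT-1B7",
                    "harness_truthfulqa_mc_0",
                    split="train")
```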
## Latest results
These are the latest results from run 2023-08-09T20:59:57.404122 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of nicholasKluge/Aira-Instruct-PT-1B7",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model nicholasKluge/Aira-Instruct-PT-1B7 on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-08-09T20:59:57.404122 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of nicholasKluge/Aira-Instruct-PT-1B7",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model nicholasKluge/Aira-Instruct-PT-1B7 on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-08-09T20:59:57.404122 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
25,
31,
173,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of nicholasKluge/Aira-Instruct-PT-1B7## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model nicholasKluge/Aira-Instruct-PT-1B7 on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-08-09T20:59:57.404122 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
0e99e3a0594ccff62dfd2da525c68a8ea1f234c9
|
# Dataset Card for Evaluation run of nicholasKluge/Aira-Instruct-355M
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/nicholasKluge/Aira-Instruct-355M
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [nicholasKluge/Aira-Instruct-355M](https://huggingface.co/nicholasKluge/Aira-Instruct-355M) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_nicholasKluge__Aira-Instruct-355M",
"harness_truthfulqa_mc_0",
split="train")
```
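A specific run can also be addressed via its timestamp-named split, and the aggregated figures live in the "results" configuration; a minimal sketch, assuming the split and configuration names declared in this card's metadata:
```python
from datasets import load_dataset

# "latest" always points at the most recent run; the timestamp-named split
# (here 2023_08_10T09_16_32.685819, taken from the metadata) pins a specific run.
run = load_dataset("open-llm-leaderboard/details_nicholasKluge__Aira-Instruct-355M",
                   "harness_truthfulqa_mc_0",
                   split="2023_08_10T09_16_32.685819")

# Aggregated metrics for the whole run are stored in the "results" configuration.
results = load_dataset("open-llm-leaderboard/details_nicholasKluge__Aira-Instruct-355M",
                       "results",
                       split="latest")
```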
## Latest results
These are the [latest results from run 2023-08-10T09:16:32.685819](https://huggingface.co/datasets/open-llm-leaderboard/details_nicholasKluge__Aira-Instruct-355M/blob/main/results_2023-08-10T09%3A16%3A32.685819.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.26193708195623533,
"acc_stderr": 0.03182336083684077,
"acc_norm": 0.263783264473725,
"acc_norm_stderr": 0.03183912280555913,
"mc1": 0.2484700122399021,
"mc1_stderr": 0.015127427096520674,
"mc2": 0.4107912986493598,
"mc2_stderr": 0.014545912502288488
},
"harness|arc:challenge|25": {
"acc": 0.23890784982935154,
"acc_stderr": 0.012461071376316621,
"acc_norm": 0.28668941979522183,
"acc_norm_stderr": 0.013214986329274765
},
"harness|hellaswag|10": {
"acc": 0.3311093407687712,
"acc_stderr": 0.004696505101217406,
"acc_norm": 0.39225253933479387,
"acc_norm_stderr": 0.004872546302641832
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847415,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847415
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.23026315789473684,
"acc_stderr": 0.034260594244031654,
"acc_norm": 0.23026315789473684,
"acc_norm_stderr": 0.034260594244031654
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.27169811320754716,
"acc_stderr": 0.027377706624670716,
"acc_norm": 0.27169811320754716,
"acc_norm_stderr": 0.027377706624670716
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.18,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.18,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2543352601156069,
"acc_stderr": 0.0332055644308557,
"acc_norm": 0.2543352601156069,
"acc_norm_stderr": 0.0332055644308557
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617747,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617747
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.225531914893617,
"acc_stderr": 0.027321078417387536,
"acc_norm": 0.225531914893617,
"acc_norm_stderr": 0.027321078417387536
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.0383515395439942,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.0383515395439942
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.25517241379310346,
"acc_stderr": 0.03632984052707842,
"acc_norm": 0.25517241379310346,
"acc_norm_stderr": 0.03632984052707842
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.26455026455026454,
"acc_stderr": 0.022717467897708617,
"acc_norm": 0.26455026455026454,
"acc_norm_stderr": 0.022717467897708617
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.1746031746031746,
"acc_stderr": 0.03395490020856113,
"acc_norm": 0.1746031746031746,
"acc_norm_stderr": 0.03395490020856113
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.24516129032258063,
"acc_stderr": 0.02447224384089553,
"acc_norm": 0.24516129032258063,
"acc_norm_stderr": 0.02447224384089553
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.30049261083743845,
"acc_stderr": 0.03225799476233483,
"acc_norm": 0.30049261083743845,
"acc_norm_stderr": 0.03225799476233483
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2787878787878788,
"acc_stderr": 0.03501438706296781,
"acc_norm": 0.2787878787878788,
"acc_norm_stderr": 0.03501438706296781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2474747474747475,
"acc_stderr": 0.030746300742124505,
"acc_norm": 0.2474747474747475,
"acc_norm_stderr": 0.030746300742124505
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.26424870466321243,
"acc_stderr": 0.03182155050916647,
"acc_norm": 0.26424870466321243,
"acc_norm_stderr": 0.03182155050916647
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2948717948717949,
"acc_stderr": 0.023119362758232287,
"acc_norm": 0.2948717948717949,
"acc_norm_stderr": 0.023119362758232287
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.026842057873833706,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.026842057873833706
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.271523178807947,
"acc_stderr": 0.03631329803969653,
"acc_norm": 0.271523178807947,
"acc_norm_stderr": 0.03631329803969653
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3376146788990826,
"acc_stderr": 0.020275265986638903,
"acc_norm": 0.3376146788990826,
"acc_norm_stderr": 0.020275265986638903
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.028353212866863438,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.028353212866863438
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.030190282453501936,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.030190282453501936
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.25738396624472576,
"acc_stderr": 0.028458820991460305,
"acc_norm": 0.25738396624472576,
"acc_norm_stderr": 0.028458820991460305
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.26905829596412556,
"acc_stderr": 0.029763779406874972,
"acc_norm": 0.26905829596412556,
"acc_norm_stderr": 0.029763779406874972
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.21374045801526717,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.21374045801526717,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.35537190082644626,
"acc_stderr": 0.04369236326573981,
"acc_norm": 0.35537190082644626,
"acc_norm_stderr": 0.04369236326573981
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.28703703703703703,
"acc_stderr": 0.043733130409147614,
"acc_norm": 0.28703703703703703,
"acc_norm_stderr": 0.043733130409147614
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3006134969325153,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.3006134969325153,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04287858751340455,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04287858751340455
},
"harness|hendrycksTest-management|5": {
"acc": 0.1650485436893204,
"acc_stderr": 0.036756688322331886,
"acc_norm": 0.1650485436893204,
"acc_norm_stderr": 0.036756688322331886
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.02934311479809446,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.02934311479809446
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.14,
"acc_stderr": 0.03487350880197768,
"acc_norm": 0.14,
"acc_norm_stderr": 0.03487350880197768
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.3001277139208174,
"acc_stderr": 0.01638924969131741,
"acc_norm": 0.3001277139208174,
"acc_norm_stderr": 0.01638924969131741
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24566473988439305,
"acc_stderr": 0.02317629820399201,
"acc_norm": 0.24566473988439305,
"acc_norm_stderr": 0.02317629820399201
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.238562091503268,
"acc_stderr": 0.02440439492808787,
"acc_norm": 0.238562091503268,
"acc_norm_stderr": 0.02440439492808787
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.29260450160771706,
"acc_stderr": 0.025839898334877983,
"acc_norm": 0.29260450160771706,
"acc_norm_stderr": 0.025839898334877983
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.024659685185967284,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.024659685185967284
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2730496453900709,
"acc_stderr": 0.026577860943307857,
"acc_norm": 0.2730496453900709,
"acc_norm_stderr": 0.026577860943307857
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24119947848761408,
"acc_stderr": 0.01092649610203495,
"acc_norm": 0.24119947848761408,
"acc_norm_stderr": 0.01092649610203495
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.028418208619406794,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.028418208619406794
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.24673202614379086,
"acc_stderr": 0.0174408203674025,
"acc_norm": 0.24673202614379086,
"acc_norm_stderr": 0.0174408203674025
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.22727272727272727,
"acc_stderr": 0.040139645540727735,
"acc_norm": 0.22727272727272727,
"acc_norm_stderr": 0.040139645540727735
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.24081632653061225,
"acc_stderr": 0.027372942201788163,
"acc_norm": 0.24081632653061225,
"acc_norm_stderr": 0.027372942201788163
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.030360490154014652,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.030360490154014652
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-virology|5": {
"acc": 0.25903614457831325,
"acc_stderr": 0.034106466140718564,
"acc_norm": 0.25903614457831325,
"acc_norm_stderr": 0.034106466140718564
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.035650796707083106,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.035650796707083106
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2484700122399021,
"mc1_stderr": 0.015127427096520674,
"mc2": 0.4107912986493598,
"mc2_stderr": 0.014545912502288488
}
}
```
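To work with these numbers programmatically, a minimal sketch, assuming the block above has been saved to a hypothetical local file `results.json`:
```python
import json

# Load the per-task metrics shown above and flag tasks at or below
# the 25% chance level of a four-option multiple-choice benchmark.
with open("results.json") as f:  # hypothetical local copy of the JSON above
    results = json.load(f)

accuracies = {task: m["acc"] for task, m in results.items()
              if task != "all" and "acc" in m}
below_chance = sorted(t for t, a in accuracies.items() if a <= 0.25)
print(f"{len(below_chance)} of {len(accuracies)} tasks at or below 4-way chance")
```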
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_nicholasKluge__Aira-Instruct-355M
|
[
"region:us"
] |
2023-08-18T10:16:30+00:00
|
{"pretty_name": "Evaluation run of nicholasKluge/Aira-Instruct-355M", "dataset_summary": "Dataset automatically created during the evaluation run of model [nicholasKluge/Aira-Instruct-355M](https://huggingface.co/nicholasKluge/Aira-Instruct-355M) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_nicholasKluge__Aira-Instruct-355M\",\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-08-10T09:16:32.685819](https://huggingface.co/datasets/open-llm-leaderboard/details_nicholasKluge__Aira-Instruct-355M/blob/main/results_2023-08-10T09%3A16%3A32.685819.json) (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.26193708195623533,\n \"acc_stderr\": 0.03182336083684077,\n \"acc_norm\": 0.263783264473725,\n \"acc_norm_stderr\": 0.03183912280555913,\n \"mc1\": 0.2484700122399021,\n \"mc1_stderr\": 0.015127427096520674,\n \"mc2\": 0.4107912986493598,\n \"mc2_stderr\": 0.014545912502288488\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.23890784982935154,\n \"acc_stderr\": 0.012461071376316621,\n \"acc_norm\": 0.28668941979522183,\n \"acc_norm_stderr\": 0.013214986329274765\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.3311093407687712,\n \"acc_stderr\": 0.004696505101217406,\n \"acc_norm\": 0.39225253933479387,\n \"acc_norm_stderr\": 0.004872546302641832\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847415,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847415\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.23026315789473684,\n \"acc_stderr\": 0.034260594244031654,\n \"acc_norm\": 0.23026315789473684,\n \"acc_norm_stderr\": 0.034260594244031654\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.27169811320754716,\n \"acc_stderr\": 0.027377706624670716,\n \"acc_norm\": 0.27169811320754716,\n \"acc_norm_stderr\": 0.027377706624670716\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 
0.03861229196653694,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.03861229196653694\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2543352601156069,\n \"acc_stderr\": 0.0332055644308557,\n \"acc_norm\": 0.2543352601156069,\n \"acc_norm_stderr\": 0.0332055644308557\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617747,\n \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617747\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.225531914893617,\n \"acc_stderr\": 0.027321078417387536,\n \"acc_norm\": 0.225531914893617,\n \"acc_norm_stderr\": 0.027321078417387536\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.0383515395439942,\n \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.0383515395439942\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.25517241379310346,\n \"acc_stderr\": 0.03632984052707842,\n \"acc_norm\": 0.25517241379310346,\n \"acc_norm_stderr\": 0.03632984052707842\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.26455026455026454,\n \"acc_stderr\": 0.022717467897708617,\n \"acc_norm\": 0.26455026455026454,\n \"acc_norm_stderr\": 0.022717467897708617\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.1746031746031746,\n \"acc_stderr\": 0.03395490020856113,\n \"acc_norm\": 0.1746031746031746,\n \"acc_norm_stderr\": 0.03395490020856113\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.24516129032258063,\n \"acc_stderr\": 0.02447224384089553,\n \"acc_norm\": 0.24516129032258063,\n \"acc_norm_stderr\": 0.02447224384089553\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.30049261083743845,\n \"acc_stderr\": 0.03225799476233483,\n \"acc_norm\": 0.30049261083743845,\n \"acc_norm_stderr\": 0.03225799476233483\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.2787878787878788,\n \"acc_stderr\": 0.03501438706296781,\n \"acc_norm\": 0.2787878787878788,\n \"acc_norm_stderr\": 0.03501438706296781\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.2474747474747475,\n \"acc_stderr\": 0.030746300742124505,\n \"acc_norm\": 0.2474747474747475,\n \"acc_norm_stderr\": 0.030746300742124505\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.26424870466321243,\n \"acc_stderr\": 0.03182155050916647,\n \"acc_norm\": 0.26424870466321243,\n \"acc_norm_stderr\": 0.03182155050916647\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.2948717948717949,\n \"acc_stderr\": 0.023119362758232287,\n \"acc_norm\": 0.2948717948717949,\n \"acc_norm_stderr\": 0.023119362758232287\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.26296296296296295,\n \"acc_stderr\": 0.026842057873833706,\n \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.026842057873833706\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.271523178807947,\n \"acc_stderr\": 0.03631329803969653,\n \"acc_norm\": 0.271523178807947,\n \"acc_norm_stderr\": 0.03631329803969653\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.3376146788990826,\n \"acc_stderr\": 0.020275265986638903,\n \"acc_norm\": 0.3376146788990826,\n \"acc_norm_stderr\": 0.020275265986638903\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.028353212866863438,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.028353212866863438\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.030190282453501936,\n \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.030190282453501936\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.25738396624472576,\n \"acc_stderr\": 0.028458820991460305,\n \"acc_norm\": 0.25738396624472576,\n \"acc_norm_stderr\": 0.028458820991460305\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.26905829596412556,\n \"acc_stderr\": 0.029763779406874972,\n \"acc_norm\": 0.26905829596412556,\n \"acc_norm_stderr\": 0.029763779406874972\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.21374045801526717,\n \"acc_stderr\": 0.0359546161177469,\n \"acc_norm\": 0.21374045801526717,\n \"acc_norm_stderr\": 0.0359546161177469\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.35537190082644626,\n \"acc_stderr\": 0.04369236326573981,\n \"acc_norm\": 0.35537190082644626,\n \"acc_norm_stderr\": 0.04369236326573981\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.28703703703703703,\n \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.28703703703703703,\n \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.3006134969325153,\n \"acc_stderr\": 0.03602511318806771,\n \"acc_norm\": 0.3006134969325153,\n \"acc_norm_stderr\": 0.03602511318806771\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.04287858751340455,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.04287858751340455\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.1650485436893204,\n \"acc_stderr\": 0.036756688322331886,\n \"acc_norm\": 0.1650485436893204,\n \"acc_norm_stderr\": 0.036756688322331886\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.02934311479809446,\n \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.02934311479809446\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.14,\n \"acc_stderr\": 0.03487350880197768,\n \"acc_norm\": 0.14,\n \"acc_norm_stderr\": 0.03487350880197768\n },\n \"harness|hendrycksTest-miscellaneous|5\": 
{\n \"acc\": 0.3001277139208174,\n \"acc_stderr\": 0.01638924969131741,\n \"acc_norm\": 0.3001277139208174,\n \"acc_norm_stderr\": 0.01638924969131741\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.24566473988439305,\n \"acc_stderr\": 0.02317629820399201,\n \"acc_norm\": 0.24566473988439305,\n \"acc_norm_stderr\": 0.02317629820399201\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.238562091503268,\n \"acc_stderr\": 0.02440439492808787,\n \"acc_norm\": 0.238562091503268,\n \"acc_norm_stderr\": 0.02440439492808787\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.29260450160771706,\n \"acc_stderr\": 0.025839898334877983,\n \"acc_norm\": 0.29260450160771706,\n \"acc_norm_stderr\": 0.025839898334877983\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.26851851851851855,\n \"acc_stderr\": 0.024659685185967284,\n \"acc_norm\": 0.26851851851851855,\n \"acc_norm_stderr\": 0.024659685185967284\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.2730496453900709,\n \"acc_stderr\": 0.026577860943307857,\n \"acc_norm\": 0.2730496453900709,\n \"acc_norm_stderr\": 0.026577860943307857\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24119947848761408,\n \"acc_stderr\": 0.01092649610203495,\n \"acc_norm\": 0.24119947848761408,\n \"acc_norm_stderr\": 0.01092649610203495\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.028418208619406794,\n \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.028418208619406794\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.24673202614379086,\n \"acc_stderr\": 0.0174408203674025,\n \"acc_norm\": 0.24673202614379086,\n \"acc_norm_stderr\": 0.0174408203674025\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.22727272727272727,\n \"acc_stderr\": 0.040139645540727735,\n \"acc_norm\": 0.22727272727272727,\n \"acc_norm_stderr\": 0.040139645540727735\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.24081632653061225,\n \"acc_stderr\": 0.027372942201788163,\n \"acc_norm\": 0.24081632653061225,\n \"acc_norm_stderr\": 0.027372942201788163\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.030360490154014652,\n \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.030360490154014652\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.25903614457831325,\n \"acc_stderr\": 0.034106466140718564,\n \"acc_norm\": 0.25903614457831325,\n \"acc_norm_stderr\": 0.034106466140718564\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3157894736842105,\n \"acc_stderr\": 0.035650796707083106,\n \"acc_norm\": 0.3157894736842105,\n \"acc_norm_stderr\": 0.035650796707083106\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2484700122399021,\n \"mc1_stderr\": 0.015127427096520674,\n \"mc2\": 0.4107912986493598,\n \"mc2_stderr\": 0.014545912502288488\n }\n}\n```", "repo_url": "https://huggingface.co/nicholasKluge/Aira-Instruct-355M", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", 
"point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_10T09_16_32.685819", "path": ["**/details_harness|arc:challenge|25_2023-08-10T09:16:32.685819.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-10T09:16:32.685819.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_10T09_16_32.685819", "path": ["**/details_harness|hellaswag|10_2023-08-10T09:16:32.685819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-10T09:16:32.685819.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_10T09_16_32.685819", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-10T09:16:32.685819.parquet", 
"**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-10T09:16:32.685819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-10T09:16:32.685819.parquet", 
"**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-10T09:16:32.685819.parquet", 
"**/details_harness|hendrycksTest-professional_psychology|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-10T09:16:32.685819.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-10T09:16:32.685819.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_10T09_16_32.685819", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-10T09:16:32.685819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-10T09:16:32.685819.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_10T09_16_32.685819", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-10T09:16:32.685819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-10T09:16:32.685819.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_10T09_16_32.685819", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-10T09:16:32.685819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-10T09:16:32.685819.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_10T09_16_32.685819", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-10T09:16:32.685819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-10T09:16:32.685819.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_10T09_16_32.685819", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-10T09:16:32.685819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-10T09:16:32.685819.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_10T09_16_32.685819", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-10T09:16:32.685819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-10T09:16:32.685819.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_10T09_16_32.685819", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-10T09:16:32.685819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-10T09:16:32.685819.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_10T09_16_32.685819", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-10T09:16:32.685819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-10T09:16:32.685819.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_10T09_16_32.685819", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-10T09:16:32.685819.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-10T09:16:32.685819.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_10T09_16_32.685819", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-10T09:16:32.685819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-10T09:16:32.685819.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_10T09_16_32.685819", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-10T09:16:32.685819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-10T09:16:32.685819.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_10T09_16_32.685819", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-10T09:16:32.685819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-10T09:16:32.685819.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_10T09_16_32.685819", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-10T09:16:32.685819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-10T09:16:32.685819.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_10T09_16_32.685819", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-10T09:16:32.685819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-10T09:16:32.685819.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_10T09_16_32.685819", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-10T09:16:32.685819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-10T09:16:32.685819.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_10T09_16_32.685819", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-10T09:16:32.685819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-10T09:16:32.685819.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_10T09_16_32.685819", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-10T09:16:32.685819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-10T09:16:32.685819.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_10T09_16_32.685819", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-10T09:16:32.685819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-10T09:16:32.685819.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_10T09_16_32.685819", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-10T09:16:32.685819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-10T09:16:32.685819.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_10T09_16_32.685819", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-10T09:16:32.685819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-10T09:16:32.685819.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_10T09_16_32.685819", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-10T09:16:32.685819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-10T09:16:32.685819.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_10T09_16_32.685819", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-10T09:16:32.685819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-10T09:16:32.685819.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_10T09_16_32.685819", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-10T09:16:32.685819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-10T09:16:32.685819.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_10T09_16_32.685819", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-10T09:16:32.685819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-10T09:16:32.685819.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_10T09_16_32.685819", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-10T09:16:32.685819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-10T09:16:32.685819.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_10T09_16_32.685819", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-10T09:16:32.685819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-10T09:16:32.685819.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_10T09_16_32.685819", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-10T09:16:32.685819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-10T09:16:32.685819.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_10T09_16_32.685819", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-10T09:16:32.685819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-10T09:16:32.685819.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_10T09_16_32.685819", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-10T09:16:32.685819.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-10T09:16:32.685819.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_10T09_16_32.685819", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-10T09:16:32.685819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-10T09:16:32.685819.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_10T09_16_32.685819", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-10T09:16:32.685819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-10T09:16:32.685819.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_10T09_16_32.685819", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-10T09:16:32.685819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-10T09:16:32.685819.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_10T09_16_32.685819", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-10T09:16:32.685819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-10T09:16:32.685819.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_10T09_16_32.685819", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-10T09:16:32.685819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-10T09:16:32.685819.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_10T09_16_32.685819", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-10T09:16:32.685819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-10T09:16:32.685819.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_10T09_16_32.685819", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-10T09:16:32.685819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-10T09:16:32.685819.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_10T09_16_32.685819", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-10T09:16:32.685819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-10T09:16:32.685819.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_10T09_16_32.685819", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-10T09:16:32.685819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-10T09:16:32.685819.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_10T09_16_32.685819", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-10T09:16:32.685819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-10T09:16:32.685819.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_10T09_16_32.685819", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-10T09:16:32.685819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-10T09:16:32.685819.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_10T09_16_32.685819", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-10T09:16:32.685819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-10T09:16:32.685819.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_10T09_16_32.685819", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-10T09:16:32.685819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-10T09:16:32.685819.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_10T09_16_32.685819", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-10T09:16:32.685819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-10T09:16:32.685819.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_10T09_16_32.685819", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-10T09:16:32.685819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-10T09:16:32.685819.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_10T09_16_32.685819", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-10T09:16:32.685819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-10T09:16:32.685819.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_10T09_16_32.685819", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-10T09:16:32.685819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-10T09:16:32.685819.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_10T09_16_32.685819", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-10T09:16:32.685819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-10T09:16:32.685819.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_10T09_16_32.685819", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-10T09:16:32.685819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-10T09:16:32.685819.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_10T09_16_32.685819", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-10T09:16:32.685819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-10T09:16:32.685819.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_10T09_16_32.685819", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-10T09:16:32.685819.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-10T09:16:32.685819.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_10T09_16_32.685819", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-10T09:16:32.685819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-10T09:16:32.685819.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_10T09_16_32.685819", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-10T09:16:32.685819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-10T09:16:32.685819.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_10T09_16_32.685819", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-10T09:16:32.685819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-10T09:16:32.685819.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_10T09_16_32.685819", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-10T09:16:32.685819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-10T09:16:32.685819.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_10T09_16_32.685819", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-10T09:16:32.685819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-10T09:16:32.685819.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_10T09_16_32.685819", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-10T09:16:32.685819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-10T09:16:32.685819.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_10T09_16_32.685819", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-10T09:16:32.685819.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-10T09:16:32.685819.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_10T09_16_32.685819", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-10T09:16:32.685819.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-10T09:16:32.685819.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_10T09_16_32.685819", "path": ["results_2023-08-10T09:16:32.685819.parquet"]}, {"split": "latest", "path": ["results_2023-08-10T09:16:32.685819.parquet"]}]}]}
|
2023-08-27T11:31:59+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of nicholasKluge/Aira-Instruct-355M
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model nicholasKluge/Aira-Instruct-355M on the Open LLM Leaderboard.
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
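For instance, a minimal sketch (the dataset id is assumed to follow the leaderboard's `details_<org>__<model>` naming convention, and the config name is taken from the configs listed in this card's metadata):

```python
from datasets import load_dataset

# Assumed dataset id, following the leaderboard's details_<org>__<model> convention
data = load_dataset("open-llm-leaderboard/details_nicholasKluge__Aira-Instruct-355M",
                    "harness_truthfulqa_mc_0",  # any config name from the metadata works
                    split="train")
```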
## Latest results
These are the latest results from run 2023-08-10T09:16:32.685819 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of nicholasKluge/Aira-Instruct-355M",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model nicholasKluge/Aira-Instruct-355M on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-08-10T09:16:32.685819 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of nicholasKluge/Aira-Instruct-355M",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model nicholasKluge/Aira-Instruct-355M on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-08-10T09:16:32.685819 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
23,
31,
171,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of nicholasKluge/Aira-Instruct-355M## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model nicholasKluge/Aira-Instruct-355M on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-08-10T09:16:32.685819 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
314e305cb0ab60f66f2c9ba475b8eeb4fe0d55b5
|
# Dataset Card for Evaluation run of nicholasKluge/Aira-Instruct-1B5
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/nicholasKluge/Aira-Instruct-1B5
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [nicholasKluge/Aira-Instruct-1B5](https://huggingface.co/nicholasKluge/Aira-Instruct-1B5) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_nicholasKluge__Aira-Instruct-1B5",
"harness_truthfulqa_mc_0",
split="train")
```
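Since each run is stored as a timestamped split and the aggregated metrics live in the "results" configuration, a particular run can also be pinned directly. A minimal sketch (the timestamped split name is an assumption, derived from the run timestamp below following the split-naming convention visible in this card's metadata):

```python
from datasets import load_dataset

# Aggregated per-task metrics; the "latest" split always points at the most recent run
results = load_dataset("open-llm-leaderboard/details_nicholasKluge__Aira-Instruct-1B5",
                       "results",
                       split="latest")  # or pin a run, e.g. "2023_08_09T20_50_15.527085" (assumed split name)
```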
## Latest results
These are the [latest results from run 2023-08-09T20:50:15.527085](https://huggingface.co/datasets/open-llm-leaderboard/details_nicholasKluge__Aira-Instruct-1B5/blob/main/results_2023-08-09T20%3A50%3A15.527085.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.26661052193576085,
"acc_stderr": 0.03205800992365353,
"acc_norm": 0.26841070974417724,
"acc_norm_stderr": 0.03207045668897672,
"mc1": 0.23011015911872704,
"mc1_stderr": 0.014734557959807763,
"mc2": 0.401147792785823,
"mc2_stderr": 0.01487516040985077
},
"harness|arc:challenge|25": {
"acc": 0.25,
"acc_stderr": 0.012653835621466646,
"acc_norm": 0.2883959044368601,
"acc_norm_stderr": 0.013238394422428162
},
"harness|hellaswag|10": {
"acc": 0.35829516032662817,
"acc_stderr": 0.004785195049889159,
"acc_norm": 0.4261103365863374,
"acc_norm_stderr": 0.004934995402995939
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768081,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768081
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.18421052631578946,
"acc_stderr": 0.0315469804508223,
"acc_norm": 0.18421052631578946,
"acc_norm_stderr": 0.0315469804508223
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.3169811320754717,
"acc_stderr": 0.028637235639800942,
"acc_norm": 0.3169811320754717,
"acc_norm_stderr": 0.028637235639800942
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2708333333333333,
"acc_stderr": 0.037161774375660164,
"acc_norm": 0.2708333333333333,
"acc_norm_stderr": 0.037161774375660164
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720683,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720683
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720683,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720683
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.32947976878612717,
"acc_stderr": 0.03583901754736411,
"acc_norm": 0.32947976878612717,
"acc_norm_stderr": 0.03583901754736411
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.040233822736177476,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.040233822736177476
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.18723404255319148,
"acc_stderr": 0.02550158834188361,
"acc_norm": 0.18723404255319148,
"acc_norm_stderr": 0.02550158834188361
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.040493392977481425,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.040493392977481425
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.23448275862068965,
"acc_stderr": 0.035306258743465914,
"acc_norm": 0.23448275862068965,
"acc_norm_stderr": 0.035306258743465914
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2671957671957672,
"acc_stderr": 0.022789673145776575,
"acc_norm": 0.2671957671957672,
"acc_norm_stderr": 0.022789673145776575
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04040610178208841,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04040610178208841
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.25483870967741934,
"acc_stderr": 0.024790118459332208,
"acc_norm": 0.25483870967741934,
"acc_norm_stderr": 0.024790118459332208
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2955665024630542,
"acc_stderr": 0.032104944337514575,
"acc_norm": 0.2955665024630542,
"acc_norm_stderr": 0.032104944337514575
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.23636363636363636,
"acc_stderr": 0.03317505930009179,
"acc_norm": 0.23636363636363636,
"acc_norm_stderr": 0.03317505930009179
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2676767676767677,
"acc_stderr": 0.03154449888270286,
"acc_norm": 0.2676767676767677,
"acc_norm_stderr": 0.03154449888270286
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.27461139896373055,
"acc_stderr": 0.03221024508041153,
"acc_norm": 0.27461139896373055,
"acc_norm_stderr": 0.03221024508041153
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2358974358974359,
"acc_stderr": 0.02152596540740872,
"acc_norm": 0.2358974358974359,
"acc_norm_stderr": 0.02152596540740872
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.026067159222275805,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.026067159222275805
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.2184873949579832,
"acc_stderr": 0.026841514322958948,
"acc_norm": 0.2184873949579832,
"acc_norm_stderr": 0.026841514322958948
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436776,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436776
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3431192660550459,
"acc_stderr": 0.02035477773608604,
"acc_norm": 0.3431192660550459,
"acc_norm_stderr": 0.02035477773608604
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.24537037037037038,
"acc_stderr": 0.02934666509437294,
"acc_norm": 0.24537037037037038,
"acc_norm_stderr": 0.02934666509437294
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25980392156862747,
"acc_stderr": 0.030778554678693268,
"acc_norm": 0.25980392156862747,
"acc_norm_stderr": 0.030778554678693268
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2616033755274262,
"acc_stderr": 0.028609516716994934,
"acc_norm": 0.2616033755274262,
"acc_norm_stderr": 0.028609516716994934
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.32286995515695066,
"acc_stderr": 0.031381476375754995,
"acc_norm": 0.32286995515695066,
"acc_norm_stderr": 0.031381476375754995
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2900763358778626,
"acc_stderr": 0.03980066246467765,
"acc_norm": 0.2900763358778626,
"acc_norm_stderr": 0.03980066246467765
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2066115702479339,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.2066115702479339,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.04284467968052191,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.04284467968052191
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3006134969325153,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.3006134969325153,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2767857142857143,
"acc_stderr": 0.042466243366976256,
"acc_norm": 0.2767857142857143,
"acc_norm_stderr": 0.042466243366976256
},
"harness|hendrycksTest-management|5": {
"acc": 0.2815533980582524,
"acc_stderr": 0.04453254836326468,
"acc_norm": 0.2815533980582524,
"acc_norm_stderr": 0.04453254836326468
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.28205128205128205,
"acc_stderr": 0.029480360549541194,
"acc_norm": 0.28205128205128205,
"acc_norm_stderr": 0.029480360549541194
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2937420178799489,
"acc_stderr": 0.016287759388491675,
"acc_norm": 0.2937420178799489,
"acc_norm_stderr": 0.016287759388491675
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.25722543352601157,
"acc_stderr": 0.02353292543104428,
"acc_norm": 0.25722543352601157,
"acc_norm_stderr": 0.02353292543104428
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24581005586592178,
"acc_stderr": 0.014400296429225622,
"acc_norm": 0.24581005586592178,
"acc_norm_stderr": 0.014400296429225622
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2908496732026144,
"acc_stderr": 0.02600480036395211,
"acc_norm": 0.2908496732026144,
"acc_norm_stderr": 0.02600480036395211
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2508038585209003,
"acc_stderr": 0.024619771956697168,
"acc_norm": 0.2508038585209003,
"acc_norm_stderr": 0.024619771956697168
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.02438366553103545,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.02438366553103545
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2375886524822695,
"acc_stderr": 0.025389512552729906,
"acc_norm": 0.2375886524822695,
"acc_norm_stderr": 0.025389512552729906
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2646675358539765,
"acc_stderr": 0.011267332992845535,
"acc_norm": 0.2646675358539765,
"acc_norm_stderr": 0.011267332992845535
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.20220588235294118,
"acc_stderr": 0.02439819298665492,
"acc_norm": 0.20220588235294118,
"acc_norm_stderr": 0.02439819298665492
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2565359477124183,
"acc_stderr": 0.01766784161237899,
"acc_norm": 0.2565359477124183,
"acc_norm_stderr": 0.01766784161237899
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.35454545454545455,
"acc_stderr": 0.04582004841505416,
"acc_norm": 0.35454545454545455,
"acc_norm_stderr": 0.04582004841505416
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.39591836734693875,
"acc_stderr": 0.03130802899065686,
"acc_norm": 0.39591836734693875,
"acc_norm_stderr": 0.03130802899065686
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23880597014925373,
"acc_stderr": 0.030147775935409217,
"acc_norm": 0.23880597014925373,
"acc_norm_stderr": 0.030147775935409217
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-virology|5": {
"acc": 0.30120481927710846,
"acc_stderr": 0.03571609230053481,
"acc_norm": 0.30120481927710846,
"acc_norm_stderr": 0.03571609230053481
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.23976608187134502,
"acc_stderr": 0.03274485211946956,
"acc_norm": 0.23976608187134502,
"acc_norm_stderr": 0.03274485211946956
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23011015911872704,
"mc1_stderr": 0.014734557959807763,
"mc2": 0.401147792785823,
"mc2_stderr": 0.01487516040985077
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_nicholasKluge__Aira-Instruct-1B5
|
[
"region:us"
] |
2023-08-18T10:16:39+00:00
|
{"pretty_name": "Evaluation run of nicholasKluge/Aira-Instruct-1B5", "dataset_summary": "Dataset automatically created during the evaluation run of model [nicholasKluge/Aira-Instruct-1B5](https://huggingface.co/nicholasKluge/Aira-Instruct-1B5) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_nicholasKluge__Aira-Instruct-1B5\",\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-08-09T20:50:15.527085](https://huggingface.co/datasets/open-llm-leaderboard/details_nicholasKluge__Aira-Instruct-1B5/blob/main/results_2023-08-09T20%3A50%3A15.527085.json) (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.26661052193576085,\n \"acc_stderr\": 0.03205800992365353,\n \"acc_norm\": 0.26841070974417724,\n \"acc_norm_stderr\": 0.03207045668897672,\n \"mc1\": 0.23011015911872704,\n \"mc1_stderr\": 0.014734557959807763,\n \"mc2\": 0.401147792785823,\n \"mc2_stderr\": 0.01487516040985077\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.012653835621466646,\n \"acc_norm\": 0.2883959044368601,\n \"acc_norm_stderr\": 0.013238394422428162\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.35829516032662817,\n \"acc_stderr\": 0.004785195049889159,\n \"acc_norm\": 0.4261103365863374,\n \"acc_norm_stderr\": 0.004934995402995939\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768081,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768081\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.18421052631578946,\n \"acc_stderr\": 0.0315469804508223,\n \"acc_norm\": 0.18421052631578946,\n \"acc_norm_stderr\": 0.0315469804508223\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.3169811320754717,\n \"acc_stderr\": 0.028637235639800942,\n \"acc_norm\": 0.3169811320754717,\n \"acc_norm_stderr\": 0.028637235639800942\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2708333333333333,\n \"acc_stderr\": 0.037161774375660164,\n \"acc_norm\": 0.2708333333333333,\n \"acc_norm_stderr\": 0.037161774375660164\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n 
\"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720683,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720683\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720683,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720683\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.32947976878612717,\n \"acc_stderr\": 0.03583901754736411,\n \"acc_norm\": 0.32947976878612717,\n \"acc_norm_stderr\": 0.03583901754736411\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.040233822736177476,\n \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.040233822736177476\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909282,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909282\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.18723404255319148,\n \"acc_stderr\": 0.02550158834188361,\n \"acc_norm\": 0.18723404255319148,\n \"acc_norm_stderr\": 0.02550158834188361\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n \"acc_stderr\": 0.040493392977481425,\n \"acc_norm\": 0.24561403508771928,\n \"acc_norm_stderr\": 0.040493392977481425\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.23448275862068965,\n \"acc_stderr\": 0.035306258743465914,\n \"acc_norm\": 0.23448275862068965,\n \"acc_norm_stderr\": 0.035306258743465914\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2671957671957672,\n \"acc_stderr\": 0.022789673145776575,\n \"acc_norm\": 0.2671957671957672,\n \"acc_norm_stderr\": 0.022789673145776575\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.25483870967741934,\n \"acc_stderr\": 0.024790118459332208,\n \"acc_norm\": 0.25483870967741934,\n \"acc_norm_stderr\": 0.024790118459332208\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.2955665024630542,\n \"acc_stderr\": 0.032104944337514575,\n \"acc_norm\": 0.2955665024630542,\n \"acc_norm_stderr\": 0.032104944337514575\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.23636363636363636,\n \"acc_stderr\": 0.03317505930009179,\n \"acc_norm\": 0.23636363636363636,\n \"acc_norm_stderr\": 0.03317505930009179\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.2676767676767677,\n \"acc_stderr\": 0.03154449888270286,\n \"acc_norm\": 0.2676767676767677,\n \"acc_norm_stderr\": 0.03154449888270286\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.27461139896373055,\n \"acc_stderr\": 0.03221024508041153,\n \"acc_norm\": 0.27461139896373055,\n \"acc_norm_stderr\": 0.03221024508041153\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": 
{\n \"acc\": 0.2358974358974359,\n \"acc_stderr\": 0.02152596540740872,\n \"acc_norm\": 0.2358974358974359,\n \"acc_norm_stderr\": 0.02152596540740872\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.24074074074074073,\n \"acc_stderr\": 0.026067159222275805,\n \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.026067159222275805\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.2184873949579832,\n \"acc_stderr\": 0.026841514322958948,\n \"acc_norm\": 0.2184873949579832,\n \"acc_norm_stderr\": 0.026841514322958948\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.3431192660550459,\n \"acc_stderr\": 0.02035477773608604,\n \"acc_norm\": 0.3431192660550459,\n \"acc_norm_stderr\": 0.02035477773608604\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.24537037037037038,\n \"acc_stderr\": 0.02934666509437294,\n \"acc_norm\": 0.24537037037037038,\n \"acc_norm_stderr\": 0.02934666509437294\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.25980392156862747,\n \"acc_stderr\": 0.030778554678693268,\n \"acc_norm\": 0.25980392156862747,\n \"acc_norm_stderr\": 0.030778554678693268\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.2616033755274262,\n \"acc_stderr\": 0.028609516716994934,\n \"acc_norm\": 0.2616033755274262,\n \"acc_norm_stderr\": 0.028609516716994934\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.32286995515695066,\n \"acc_stderr\": 0.031381476375754995,\n \"acc_norm\": 0.32286995515695066,\n \"acc_norm_stderr\": 0.031381476375754995\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2900763358778626,\n \"acc_stderr\": 0.03980066246467765,\n \"acc_norm\": 0.2900763358778626,\n \"acc_norm_stderr\": 0.03980066246467765\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2066115702479339,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.2066115702479339,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.26851851851851855,\n \"acc_stderr\": 0.04284467968052191,\n \"acc_norm\": 0.26851851851851855,\n \"acc_norm_stderr\": 0.04284467968052191\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.3006134969325153,\n \"acc_stderr\": 0.03602511318806771,\n \"acc_norm\": 0.3006134969325153,\n \"acc_norm_stderr\": 0.03602511318806771\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2767857142857143,\n \"acc_stderr\": 0.042466243366976256,\n \"acc_norm\": 0.2767857142857143,\n \"acc_norm_stderr\": 0.042466243366976256\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.2815533980582524,\n \"acc_stderr\": 0.04453254836326468,\n \"acc_norm\": 0.2815533980582524,\n \"acc_norm_stderr\": 0.04453254836326468\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.28205128205128205,\n \"acc_stderr\": 0.029480360549541194,\n \"acc_norm\": 0.28205128205128205,\n \"acc_norm_stderr\": 0.029480360549541194\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536955,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536955\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2937420178799489,\n \"acc_stderr\": 
0.016287759388491675,\n \"acc_norm\": 0.2937420178799489,\n \"acc_norm_stderr\": 0.016287759388491675\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.25722543352601157,\n \"acc_stderr\": 0.02353292543104428,\n \"acc_norm\": 0.25722543352601157,\n \"acc_norm_stderr\": 0.02353292543104428\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24581005586592178,\n \"acc_stderr\": 0.014400296429225622,\n \"acc_norm\": 0.24581005586592178,\n \"acc_norm_stderr\": 0.014400296429225622\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.2908496732026144,\n \"acc_stderr\": 0.02600480036395211,\n \"acc_norm\": 0.2908496732026144,\n \"acc_norm_stderr\": 0.02600480036395211\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2508038585209003,\n \"acc_stderr\": 0.024619771956697168,\n \"acc_norm\": 0.2508038585209003,\n \"acc_norm_stderr\": 0.024619771956697168\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.02438366553103545,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.02438366553103545\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.2375886524822695,\n \"acc_stderr\": 0.025389512552729906,\n \"acc_norm\": 0.2375886524822695,\n \"acc_norm_stderr\": 0.025389512552729906\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2646675358539765,\n \"acc_stderr\": 0.011267332992845535,\n \"acc_norm\": 0.2646675358539765,\n \"acc_norm_stderr\": 0.011267332992845535\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.20220588235294118,\n \"acc_stderr\": 0.02439819298665492,\n \"acc_norm\": 0.20220588235294118,\n \"acc_norm_stderr\": 0.02439819298665492\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.2565359477124183,\n \"acc_stderr\": 0.01766784161237899,\n \"acc_norm\": 0.2565359477124183,\n \"acc_norm_stderr\": 0.01766784161237899\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.35454545454545455,\n \"acc_stderr\": 0.04582004841505416,\n \"acc_norm\": 0.35454545454545455,\n \"acc_norm_stderr\": 0.04582004841505416\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.39591836734693875,\n \"acc_stderr\": 0.03130802899065686,\n \"acc_norm\": 0.39591836734693875,\n \"acc_norm_stderr\": 0.03130802899065686\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23880597014925373,\n \"acc_stderr\": 0.030147775935409217,\n \"acc_norm\": 0.23880597014925373,\n \"acc_norm_stderr\": 0.030147775935409217\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.30120481927710846,\n \"acc_stderr\": 0.03571609230053481,\n \"acc_norm\": 0.30120481927710846,\n \"acc_norm_stderr\": 0.03571609230053481\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.23976608187134502,\n \"acc_stderr\": 0.03274485211946956,\n \"acc_norm\": 0.23976608187134502,\n \"acc_norm_stderr\": 0.03274485211946956\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23011015911872704,\n \"mc1_stderr\": 0.014734557959807763,\n \"mc2\": 0.401147792785823,\n \"mc2_stderr\": 0.01487516040985077\n }\n}\n```", "repo_url": "https://huggingface.co/nicholasKluge/Aira-Instruct-1B5", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": 
[{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_09T20_49_00.649710", "path": ["**/details_harness|arc:challenge|25_2023-08-09T20:49:00.649710.parquet"]}, {"split": "2023_08_09T20_50_15.527085", "path": ["**/details_harness|arc:challenge|25_2023-08-09T20:50:15.527085.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-09T20:50:15.527085.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_09T20_49_00.649710", "path": ["**/details_harness|hellaswag|10_2023-08-09T20:49:00.649710.parquet"]}, {"split": "2023_08_09T20_50_15.527085", "path": ["**/details_harness|hellaswag|10_2023-08-09T20:50:15.527085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-09T20:50:15.527085.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_09T20_49_00.649710", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T20:49:00.649710.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T20:49:00.649710.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T20:49:00.649710.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T20:49:00.649710.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T20:49:00.649710.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T20:49:00.649710.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T20:49:00.649710.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T20:49:00.649710.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T20:49:00.649710.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T20:49:00.649710.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T20:49:00.649710.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T20:49:00.649710.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T20:49:00.649710.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T20:49:00.649710.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T20:49:00.649710.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T20:49:00.649710.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T20:49:00.649710.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T20:49:00.649710.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T20:49:00.649710.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T20:49:00.649710.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T20:49:00.649710.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T20:49:00.649710.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T20:49:00.649710.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T20:49:00.649710.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T20:49:00.649710.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T20:49:00.649710.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T20:49:00.649710.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T20:49:00.649710.parquet", 
"**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T20:49:00.649710.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T20:49:00.649710.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T20:49:00.649710.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T20:49:00.649710.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T20:49:00.649710.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T20:49:00.649710.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T20:49:00.649710.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T20:49:00.649710.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T20:49:00.649710.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T20:49:00.649710.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T20:49:00.649710.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T20:49:00.649710.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T20:49:00.649710.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T20:49:00.649710.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T20:49:00.649710.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T20:49:00.649710.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T20:49:00.649710.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T20:49:00.649710.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T20:49:00.649710.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T20:49:00.649710.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T20:49:00.649710.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T20:49:00.649710.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T20:49:00.649710.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T20:49:00.649710.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T20:49:00.649710.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T20:49:00.649710.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T20:49:00.649710.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T20:49:00.649710.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T20:49:00.649710.parquet"]}, {"split": "2023_08_09T20_50_15.527085", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T20:50:15.527085.parquet", 
"**/details_harness|hendrycksTest-college_physics|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T20:50:15.527085.parquet", 
"**/details_harness|hendrycksTest-professional_law|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T20:50:15.527085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T20:50:15.527085.parquet", 
"**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T20:50:15.527085.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T20:50:15.527085.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_09T20_49_00.649710", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T20:49:00.649710.parquet"]}, {"split": "2023_08_09T20_50_15.527085", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T20:50:15.527085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T20:50:15.527085.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_09T20_49_00.649710", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T20:49:00.649710.parquet"]}, {"split": "2023_08_09T20_50_15.527085", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T20:50:15.527085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T20:50:15.527085.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_09T20_49_00.649710", "path": 
["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T20:49:00.649710.parquet"]}, {"split": "2023_08_09T20_50_15.527085", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T20:50:15.527085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T20:50:15.527085.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_09T20_49_00.649710", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T20:49:00.649710.parquet"]}, {"split": "2023_08_09T20_50_15.527085", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T20:50:15.527085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T20:50:15.527085.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_09T20_49_00.649710", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T20:49:00.649710.parquet"]}, {"split": "2023_08_09T20_50_15.527085", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T20:50:15.527085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T20:50:15.527085.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_09T20_49_00.649710", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T20:49:00.649710.parquet"]}, {"split": "2023_08_09T20_50_15.527085", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T20:50:15.527085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T20:50:15.527085.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_09T20_49_00.649710", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T20:49:00.649710.parquet"]}, {"split": "2023_08_09T20_50_15.527085", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T20:50:15.527085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T20:50:15.527085.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_09T20_49_00.649710", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T20:49:00.649710.parquet"]}, {"split": "2023_08_09T20_50_15.527085", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T20:50:15.527085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T20:50:15.527085.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_09T20_49_00.649710", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T20:49:00.649710.parquet"]}, {"split": "2023_08_09T20_50_15.527085", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T20:50:15.527085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T20:50:15.527085.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_09T20_49_00.649710", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T20:49:00.649710.parquet"]}, {"split": "2023_08_09T20_50_15.527085", "path": 
["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T20:50:15.527085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T20:50:15.527085.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_09T20_49_00.649710", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T20:49:00.649710.parquet"]}, {"split": "2023_08_09T20_50_15.527085", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T20:50:15.527085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T20:50:15.527085.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_09T20_49_00.649710", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T20:49:00.649710.parquet"]}, {"split": "2023_08_09T20_50_15.527085", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T20:50:15.527085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T20:50:15.527085.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_09T20_49_00.649710", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T20:49:00.649710.parquet"]}, {"split": "2023_08_09T20_50_15.527085", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T20:50:15.527085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T20:50:15.527085.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_09T20_49_00.649710", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T20:49:00.649710.parquet"]}, {"split": "2023_08_09T20_50_15.527085", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T20:50:15.527085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T20:50:15.527085.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_09T20_49_00.649710", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T20:49:00.649710.parquet"]}, {"split": "2023_08_09T20_50_15.527085", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T20:50:15.527085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T20:50:15.527085.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_09T20_49_00.649710", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T20:49:00.649710.parquet"]}, {"split": "2023_08_09T20_50_15.527085", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T20:50:15.527085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T20:50:15.527085.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_09T20_49_00.649710", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T20:49:00.649710.parquet"]}, {"split": "2023_08_09T20_50_15.527085", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T20:50:15.527085.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T20:50:15.527085.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_09T20_49_00.649710", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T20:49:00.649710.parquet"]}, {"split": "2023_08_09T20_50_15.527085", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T20:50:15.527085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T20:50:15.527085.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_09T20_49_00.649710", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T20:49:00.649710.parquet"]}, {"split": "2023_08_09T20_50_15.527085", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T20:50:15.527085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T20:50:15.527085.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_09T20_49_00.649710", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T20:49:00.649710.parquet"]}, {"split": "2023_08_09T20_50_15.527085", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T20:50:15.527085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T20:50:15.527085.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_09T20_49_00.649710", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T20:49:00.649710.parquet"]}, {"split": "2023_08_09T20_50_15.527085", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T20:50:15.527085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T20:50:15.527085.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_09T20_49_00.649710", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T20:49:00.649710.parquet"]}, {"split": "2023_08_09T20_50_15.527085", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T20:50:15.527085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T20:50:15.527085.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_09T20_49_00.649710", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T20:49:00.649710.parquet"]}, {"split": "2023_08_09T20_50_15.527085", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T20:50:15.527085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T20:50:15.527085.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_09T20_49_00.649710", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T20:49:00.649710.parquet"]}, {"split": "2023_08_09T20_50_15.527085", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T20:50:15.527085.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T20:50:15.527085.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_09T20_49_00.649710", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T20:49:00.649710.parquet"]}, {"split": "2023_08_09T20_50_15.527085", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T20:50:15.527085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T20:50:15.527085.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_09T20_49_00.649710", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T20:49:00.649710.parquet"]}, {"split": "2023_08_09T20_50_15.527085", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T20:50:15.527085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T20:50:15.527085.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_09T20_49_00.649710", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T20:49:00.649710.parquet"]}, {"split": "2023_08_09T20_50_15.527085", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T20:50:15.527085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T20:50:15.527085.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_09T20_49_00.649710", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T20:49:00.649710.parquet"]}, {"split": "2023_08_09T20_50_15.527085", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T20:50:15.527085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T20:50:15.527085.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_09T20_49_00.649710", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T20:49:00.649710.parquet"]}, {"split": "2023_08_09T20_50_15.527085", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T20:50:15.527085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T20:50:15.527085.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_09T20_49_00.649710", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T20:49:00.649710.parquet"]}, {"split": "2023_08_09T20_50_15.527085", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T20:50:15.527085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T20:50:15.527085.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_09T20_49_00.649710", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T20:49:00.649710.parquet"]}, {"split": "2023_08_09T20_50_15.527085", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T20:50:15.527085.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T20:50:15.527085.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_09T20_49_00.649710", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T20:49:00.649710.parquet"]}, {"split": "2023_08_09T20_50_15.527085", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T20:50:15.527085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T20:50:15.527085.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_09T20_49_00.649710", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T20:49:00.649710.parquet"]}, {"split": "2023_08_09T20_50_15.527085", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T20:50:15.527085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T20:50:15.527085.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_09T20_49_00.649710", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T20:49:00.649710.parquet"]}, {"split": "2023_08_09T20_50_15.527085", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T20:50:15.527085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T20:50:15.527085.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_09T20_49_00.649710", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T20:49:00.649710.parquet"]}, {"split": "2023_08_09T20_50_15.527085", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T20:50:15.527085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T20:50:15.527085.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_09T20_49_00.649710", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T20:49:00.649710.parquet"]}, {"split": "2023_08_09T20_50_15.527085", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T20:50:15.527085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T20:50:15.527085.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_09T20_49_00.649710", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T20:49:00.649710.parquet"]}, {"split": "2023_08_09T20_50_15.527085", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T20:50:15.527085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T20:50:15.527085.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_09T20_49_00.649710", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T20:49:00.649710.parquet"]}, {"split": "2023_08_09T20_50_15.527085", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T20:50:15.527085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T20:50:15.527085.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": 
[{"split": "2023_08_09T20_49_00.649710", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T20:49:00.649710.parquet"]}, {"split": "2023_08_09T20_50_15.527085", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T20:50:15.527085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T20:50:15.527085.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_09T20_49_00.649710", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T20:49:00.649710.parquet"]}, {"split": "2023_08_09T20_50_15.527085", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T20:50:15.527085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T20:50:15.527085.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_09T20_49_00.649710", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T20:49:00.649710.parquet"]}, {"split": "2023_08_09T20_50_15.527085", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T20:50:15.527085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T20:50:15.527085.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_09T20_49_00.649710", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T20:49:00.649710.parquet"]}, {"split": "2023_08_09T20_50_15.527085", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T20:50:15.527085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T20:50:15.527085.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_09T20_49_00.649710", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T20:49:00.649710.parquet"]}, {"split": "2023_08_09T20_50_15.527085", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T20:50:15.527085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T20:50:15.527085.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_09T20_49_00.649710", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T20:49:00.649710.parquet"]}, {"split": "2023_08_09T20_50_15.527085", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T20:50:15.527085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T20:50:15.527085.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_09T20_49_00.649710", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T20:49:00.649710.parquet"]}, {"split": "2023_08_09T20_50_15.527085", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T20:50:15.527085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T20:50:15.527085.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_09T20_49_00.649710", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T20:49:00.649710.parquet"]}, {"split": "2023_08_09T20_50_15.527085", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T20:50:15.527085.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T20:50:15.527085.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_09T20_49_00.649710", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T20:49:00.649710.parquet"]}, {"split": "2023_08_09T20_50_15.527085", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T20:50:15.527085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T20:50:15.527085.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_09T20_49_00.649710", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T20:49:00.649710.parquet"]}, {"split": "2023_08_09T20_50_15.527085", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T20:50:15.527085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T20:50:15.527085.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_09T20_49_00.649710", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T20:49:00.649710.parquet"]}, {"split": "2023_08_09T20_50_15.527085", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T20:50:15.527085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T20:50:15.527085.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_09T20_49_00.649710", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T20:49:00.649710.parquet"]}, {"split": "2023_08_09T20_50_15.527085", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T20:50:15.527085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T20:50:15.527085.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_09T20_49_00.649710", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T20:49:00.649710.parquet"]}, {"split": "2023_08_09T20_50_15.527085", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T20:50:15.527085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T20:50:15.527085.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_09T20_49_00.649710", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T20:49:00.649710.parquet"]}, {"split": "2023_08_09T20_50_15.527085", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T20:50:15.527085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T20:50:15.527085.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_09T20_49_00.649710", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T20:49:00.649710.parquet"]}, {"split": "2023_08_09T20_50_15.527085", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T20:50:15.527085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T20:50:15.527085.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", 
"data_files": [{"split": "2023_08_09T20_49_00.649710", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T20:49:00.649710.parquet"]}, {"split": "2023_08_09T20_50_15.527085", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T20:50:15.527085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T20:50:15.527085.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_09T20_49_00.649710", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T20:49:00.649710.parquet"]}, {"split": "2023_08_09T20_50_15.527085", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T20:50:15.527085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T20:50:15.527085.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_09T20_49_00.649710", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T20:49:00.649710.parquet"]}, {"split": "2023_08_09T20_50_15.527085", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T20:50:15.527085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T20:50:15.527085.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_09T20_49_00.649710", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T20:49:00.649710.parquet"]}, {"split": "2023_08_09T20_50_15.527085", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T20:50:15.527085.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T20:50:15.527085.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_09T20_49_00.649710", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T20:49:00.649710.parquet"]}, {"split": "2023_08_09T20_50_15.527085", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T20:50:15.527085.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T20:50:15.527085.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_09T20_49_00.649710", "path": ["results_2023-08-09T20:49:00.649710.parquet"]}, {"split": "2023_08_09T20_50_15.527085", "path": ["results_2023-08-09T20:50:15.527085.parquet"]}, {"split": "latest", "path": ["results_2023-08-09T20:50:15.527085.parquet"]}]}]}
# Dataset Card for Evaluation run of nicholasKluge/Aira-Instruct-1B5
## Dataset Description
- Homepage:
- Repository: https://huggingface.co/nicholasKluge/Aira-Instruct-1B5
- Paper:
- Leaderboard: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- Point of Contact: [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [nicholasKluge/Aira-Instruct-1B5](https://huggingface.co/nicholasKluge/Aira-Instruct-1B5) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
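As a minimal sketch (the config name "results" and the split names are taken from the file listing in the metadata above), the aggregated results can be loaded directly:

```python
from datasets import load_dataset

# "results" holds the aggregated scores; the "latest" split always points
# to the most recent run, while timestamped splits (e.g.
# "2023_08_09T20_49_00.649710") select a specific evaluation run.
results = load_dataset("open-llm-leaderboard/details_nicholasKluge__Aira-Instruct-1B5",
                       "results",
                       split="latest")
```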
To load the details from a run, you can for instance do the following:
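```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_nicholasKluge__Aira-Instruct-1B5",
	"harness_truthfulqa_mc_0",
	split="train")
```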
## Latest results
These are the [latest results from run 2023-08-09T20:50:15.527085](https://huggingface.co/datasets/open-llm-leaderboard/details_nicholasKluge__Aira-Instruct-1B5/blob/main/results_2023-08-09T20%3A50%3A15.527085.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
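For reference, the "all" aggregate from that run is reproduced below; the full per-task breakdown (ARC, HellaSwag, the individual MMLU subjects, and TruthfulQA) is stored verbatim in the configuration metadata above:

```python
{
    "all": {
        "acc": 0.26661052193576085,
        "acc_stderr": 0.03205800992365353,
        "acc_norm": 0.26841070974417724,
        "acc_norm_stderr": 0.03207045668897672,
        "mc1": 0.23011015911872704,
        "mc1_stderr": 0.014734557959807763,
        "mc2": 0.401147792785823,
        "mc2_stderr": 0.01487516040985077
    }
}
```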
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of nicholasKluge/Aira-Instruct-1B5",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model nicholasKluge/Aira-Instruct-1B5 on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-08-09T20:50:15.527085 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of nicholasKluge/Aira-Instruct-1B5",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model nicholasKluge/Aira-Instruct-1B5 on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-08-09T20:50:15.527085 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
23,
31,
171,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of nicholasKluge/Aira-Instruct-1B5## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model nicholasKluge/Aira-Instruct-1B5 on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-08-09T20:50:15.527085 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
37c330f3b8471fe4bf7708a38499a0016b11d43a
|
# Dataset Card for Evaluation run of nicholasKluge/Aira-Instruct-124M
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/nicholasKluge/Aira-Instruct-124M
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [nicholasKluge/Aira-Instruct-124M](https://huggingface.co/nicholasKluge/Aira-Instruct-124M) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_nicholasKluge__Aira-Instruct-124M",
"harness_truthfulqa_mc_0",
split="train")
```
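The aggregated metrics live in the "results" configuration described above; a minimal sketch of pulling the most recent aggregate row via the `latest` split (config and split names taken from this card's own metadata):
```python
from datasets import load_dataset

# The "results" config aggregates all per-task metrics for this model;
# the "latest" split always points at the most recent evaluation run.
results = load_dataset("open-llm-leaderboard/details_nicholasKluge__Aira-Instruct-124M",
	"results",
	split="latest")
```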
## Latest results
These are the [latest results from run 2023-08-10T09:14:16.516035](https://huggingface.co/datasets/open-llm-leaderboard/details_nicholasKluge__Aira-Instruct-124M/blob/main/results_2023-08-10T09%3A14%3A16.516035.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.25097821278031224,
"acc_stderr": 0.03126312568682377,
"acc_norm": 0.25197883172295243,
"acc_norm_stderr": 0.03127882498671644,
"mc1": 0.22766217870257038,
"mc1_stderr": 0.014679255032111075,
"mc2": 0.3793773096260545,
"mc2_stderr": 0.01493606177741941
},
"harness|arc:challenge|25": {
"acc": 0.19368600682593856,
"acc_stderr": 0.01154842540997854,
"acc_norm": 0.2354948805460751,
"acc_norm_stderr": 0.012399451855004753
},
"harness|hellaswag|10": {
"acc": 0.2909778928500299,
"acc_stderr": 0.004532850566893526,
"acc_norm": 0.3082055367456682,
"acc_norm_stderr": 0.004608082815535503
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.03999262876617721,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.03999262876617721
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.18421052631578946,
"acc_stderr": 0.0315469804508223,
"acc_norm": 0.18421052631578946,
"acc_norm_stderr": 0.0315469804508223
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2339622641509434,
"acc_stderr": 0.02605529690115292,
"acc_norm": 0.2339622641509434,
"acc_norm_stderr": 0.02605529690115292
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.14,
"acc_stderr": 0.03487350880197769,
"acc_norm": 0.14,
"acc_norm_stderr": 0.03487350880197769
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.26011560693641617,
"acc_stderr": 0.03345036916788992,
"acc_norm": 0.26011560693641617,
"acc_norm_stderr": 0.03345036916788992
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2553191489361702,
"acc_stderr": 0.0285048564705142,
"acc_norm": 0.2553191489361702,
"acc_norm_stderr": 0.0285048564705142
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.04142439719489361,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.04142439719489361
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135303,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135303
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.022644212615525218,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.022644212615525218
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.15079365079365079,
"acc_stderr": 0.03200686497287392,
"acc_norm": 0.15079365079365079,
"acc_norm_stderr": 0.03200686497287392
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.17,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.17,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2,
"acc_stderr": 0.022755204959542936,
"acc_norm": 0.2,
"acc_norm_stderr": 0.022755204959542936
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.29064039408866993,
"acc_stderr": 0.0319474007226554,
"acc_norm": 0.29064039408866993,
"acc_norm_stderr": 0.0319474007226554
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.32323232323232326,
"acc_stderr": 0.03332299921070644,
"acc_norm": 0.32323232323232326,
"acc_norm_stderr": 0.03332299921070644
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.29015544041450775,
"acc_stderr": 0.03275264467791516,
"acc_norm": 0.29015544041450775,
"acc_norm_stderr": 0.03275264467791516
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.24102564102564103,
"acc_stderr": 0.021685546665333195,
"acc_norm": 0.24102564102564103,
"acc_norm_stderr": 0.021685546665333195
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.22592592592592592,
"acc_stderr": 0.02549753263960955,
"acc_norm": 0.22592592592592592,
"acc_norm_stderr": 0.02549753263960955
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.1722689075630252,
"acc_stderr": 0.02452866497130541,
"acc_norm": 0.1722689075630252,
"acc_norm_stderr": 0.02452866497130541
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.17218543046357615,
"acc_stderr": 0.030826136961962385,
"acc_norm": 0.17218543046357615,
"acc_norm_stderr": 0.030826136961962385
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.28440366972477066,
"acc_stderr": 0.019342036587702605,
"acc_norm": 0.28440366972477066,
"acc_norm_stderr": 0.019342036587702605
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4675925925925926,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.4675925925925926,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25980392156862747,
"acc_stderr": 0.030778554678693264,
"acc_norm": 0.25980392156862747,
"acc_norm_stderr": 0.030778554678693264
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.31645569620253167,
"acc_stderr": 0.03027497488021898,
"acc_norm": 0.31645569620253167,
"acc_norm_stderr": 0.03027497488021898
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.10762331838565023,
"acc_stderr": 0.02079940008287998,
"acc_norm": 0.10762331838565023,
"acc_norm_stderr": 0.02079940008287998
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.26717557251908397,
"acc_stderr": 0.038808483010823944,
"acc_norm": 0.26717557251908397,
"acc_norm_stderr": 0.038808483010823944
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.32231404958677684,
"acc_stderr": 0.04266416363352167,
"acc_norm": 0.32231404958677684,
"acc_norm_stderr": 0.04266416363352167
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3006134969325153,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.3006134969325153,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.29464285714285715,
"acc_stderr": 0.043270409325787296,
"acc_norm": 0.29464285714285715,
"acc_norm_stderr": 0.043270409325787296
},
"harness|hendrycksTest-management|5": {
"acc": 0.18446601941747573,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.18446601941747573,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2905982905982906,
"acc_stderr": 0.02974504857267404,
"acc_norm": 0.2905982905982906,
"acc_norm_stderr": 0.02974504857267404
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2554278416347382,
"acc_stderr": 0.015594955384455765,
"acc_norm": 0.2554278416347382,
"acc_norm_stderr": 0.015594955384455765
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2446927374301676,
"acc_stderr": 0.01437816988409841,
"acc_norm": 0.2446927374301676,
"acc_norm_stderr": 0.01437816988409841
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.25163398692810457,
"acc_stderr": 0.024848018263875195,
"acc_norm": 0.25163398692810457,
"acc_norm_stderr": 0.024848018263875195
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.18971061093247588,
"acc_stderr": 0.022268196258783218,
"acc_norm": 0.18971061093247588,
"acc_norm_stderr": 0.022268196258783218
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2191358024691358,
"acc_stderr": 0.0230167056402622,
"acc_norm": 0.2191358024691358,
"acc_norm_stderr": 0.0230167056402622
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2730496453900709,
"acc_stderr": 0.026577860943307854,
"acc_norm": 0.2730496453900709,
"acc_norm_stderr": 0.026577860943307854
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24511082138200782,
"acc_stderr": 0.010986307870045517,
"acc_norm": 0.24511082138200782,
"acc_norm_stderr": 0.010986307870045517
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3161764705882353,
"acc_stderr": 0.028245687391462916,
"acc_norm": 0.3161764705882353,
"acc_norm_stderr": 0.028245687391462916
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.30612244897959184,
"acc_stderr": 0.029504896454595957,
"acc_norm": 0.30612244897959184,
"acc_norm_stderr": 0.029504896454595957
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23383084577114427,
"acc_stderr": 0.029929415408348377,
"acc_norm": 0.23383084577114427,
"acc_norm_stderr": 0.029929415408348377
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.23493975903614459,
"acc_stderr": 0.03300533186128922,
"acc_norm": 0.23493975903614459,
"acc_norm_stderr": 0.03300533186128922
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.036155076303109344,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.036155076303109344
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22766217870257038,
"mc1_stderr": 0.014679255032111075,
"mc2": 0.3793773096260545,
"mc2_stderr": 0.01493606177741941
}
}
```
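For programmatic access, the dictionary above keys each entry as `harness|<task>|<num_fewshot>`; a small sketch of reading one metric out of such a payload (the `results_json` variable is a stand-in for the JSON shown above, e.g. read from the results file linked under "Latest results"):
```python
import json

# Stand-in for the results JSON above; in practice, load it from the linked file.
results_json = """{"harness|truthfulqa:mc|0": {"mc1": 0.22766217870257038, "mc2": 0.3793773096260545}}"""

metrics = json.loads(results_json)
# Keys follow the harness|<task>|<num_fewshot> convention used throughout this card.
mc2 = metrics["harness|truthfulqa:mc|0"]["mc2"]
print(f"TruthfulQA MC2: {mc2:.4f}")
```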
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_nicholasKluge__Aira-Instruct-124M
|
[
"region:us"
] |
2023-08-18T10:16:50+00:00
|
{"pretty_name": "Evaluation run of nicholasKluge/Aira-Instruct-124M", "dataset_summary": "Dataset automatically created during the evaluation run of model [nicholasKluge/Aira-Instruct-124M](https://huggingface.co/nicholasKluge/Aira-Instruct-124M) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_nicholasKluge__Aira-Instruct-124M\",\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-08-10T09:14:16.516035](https://huggingface.co/datasets/open-llm-leaderboard/details_nicholasKluge__Aira-Instruct-124M/blob/main/results_2023-08-10T09%3A14%3A16.516035.json) (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.25097821278031224,\n \"acc_stderr\": 0.03126312568682377,\n \"acc_norm\": 0.25197883172295243,\n \"acc_norm_stderr\": 0.03127882498671644,\n \"mc1\": 0.22766217870257038,\n \"mc1_stderr\": 0.014679255032111075,\n \"mc2\": 0.3793773096260545,\n \"mc2_stderr\": 0.01493606177741941\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.19368600682593856,\n \"acc_stderr\": 0.01154842540997854,\n \"acc_norm\": 0.2354948805460751,\n \"acc_norm_stderr\": 0.012399451855004753\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2909778928500299,\n \"acc_stderr\": 0.004532850566893526,\n \"acc_norm\": 0.3082055367456682,\n \"acc_norm_stderr\": 0.004608082815535503\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3111111111111111,\n \"acc_stderr\": 0.03999262876617721,\n \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.03999262876617721\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.18421052631578946,\n \"acc_stderr\": 0.0315469804508223,\n \"acc_norm\": 0.18421052631578946,\n \"acc_norm_stderr\": 0.0315469804508223\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.2339622641509434,\n \"acc_stderr\": 0.02605529690115292,\n \"acc_norm\": 0.2339622641509434,\n \"acc_norm_stderr\": 0.02605529690115292\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.14,\n \"acc_stderr\": 
0.03487350880197769,\n \"acc_norm\": 0.14,\n \"acc_norm_stderr\": 0.03487350880197769\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.26011560693641617,\n \"acc_stderr\": 0.03345036916788992,\n \"acc_norm\": 0.26011560693641617,\n \"acc_norm_stderr\": 0.03345036916788992\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.2553191489361702,\n \"acc_stderr\": 0.0285048564705142,\n \"acc_norm\": 0.2553191489361702,\n \"acc_norm_stderr\": 0.0285048564705142\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n \"acc_stderr\": 0.04142439719489361,\n \"acc_norm\": 0.2631578947368421,\n \"acc_norm_stderr\": 0.04142439719489361\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135303,\n \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135303\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2619047619047619,\n \"acc_stderr\": 0.022644212615525218,\n \"acc_norm\": 0.2619047619047619,\n \"acc_norm_stderr\": 0.022644212615525218\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.15079365079365079,\n \"acc_stderr\": 0.03200686497287392,\n \"acc_norm\": 0.15079365079365079,\n \"acc_norm_stderr\": 0.03200686497287392\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.17,\n \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.17,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.022755204959542936,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.022755204959542936\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.29064039408866993,\n \"acc_stderr\": 0.0319474007226554,\n \"acc_norm\": 0.29064039408866993,\n \"acc_norm_stderr\": 0.0319474007226554\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.32323232323232326,\n \"acc_stderr\": 0.03332299921070644,\n \"acc_norm\": 0.32323232323232326,\n \"acc_norm_stderr\": 0.03332299921070644\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.29015544041450775,\n \"acc_stderr\": 0.03275264467791516,\n \"acc_norm\": 0.29015544041450775,\n \"acc_norm_stderr\": 0.03275264467791516\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.24102564102564103,\n \"acc_stderr\": 0.021685546665333195,\n \"acc_norm\": 0.24102564102564103,\n \"acc_norm_stderr\": 0.021685546665333195\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.22592592592592592,\n \"acc_stderr\": 0.02549753263960955,\n \"acc_norm\": 0.22592592592592592,\n \"acc_norm_stderr\": 0.02549753263960955\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.1722689075630252,\n \"acc_stderr\": 0.02452866497130541,\n \"acc_norm\": 0.1722689075630252,\n \"acc_norm_stderr\": 0.02452866497130541\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.17218543046357615,\n \"acc_stderr\": 0.030826136961962385,\n \"acc_norm\": 0.17218543046357615,\n \"acc_norm_stderr\": 0.030826136961962385\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.28440366972477066,\n \"acc_stderr\": 0.019342036587702605,\n \"acc_norm\": 0.28440366972477066,\n \"acc_norm_stderr\": 0.019342036587702605\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4675925925925926,\n \"acc_stderr\": 0.03402801581358966,\n \"acc_norm\": 0.4675925925925926,\n \"acc_norm_stderr\": 0.03402801581358966\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.25980392156862747,\n \"acc_stderr\": 0.030778554678693264,\n \"acc_norm\": 0.25980392156862747,\n \"acc_norm_stderr\": 0.030778554678693264\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.31645569620253167,\n \"acc_stderr\": 0.03027497488021898,\n \"acc_norm\": 0.31645569620253167,\n \"acc_norm_stderr\": 0.03027497488021898\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.10762331838565023,\n \"acc_stderr\": 0.02079940008287998,\n \"acc_norm\": 0.10762331838565023,\n \"acc_norm_stderr\": 0.02079940008287998\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.26717557251908397,\n \"acc_stderr\": 0.038808483010823944,\n \"acc_norm\": 0.26717557251908397,\n \"acc_norm_stderr\": 0.038808483010823944\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.32231404958677684,\n \"acc_stderr\": 0.04266416363352167,\n \"acc_norm\": 0.32231404958677684,\n \"acc_norm_stderr\": 0.04266416363352167\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.3006134969325153,\n \"acc_stderr\": 0.03602511318806771,\n \"acc_norm\": 0.3006134969325153,\n \"acc_norm_stderr\": 0.03602511318806771\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n \"acc_stderr\": 0.043270409325787296,\n \"acc_norm\": 0.29464285714285715,\n \"acc_norm_stderr\": 0.043270409325787296\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.18446601941747573,\n \"acc_stderr\": 0.03840423627288276,\n \"acc_norm\": 0.18446601941747573,\n \"acc_norm_stderr\": 0.03840423627288276\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n \"acc_stderr\": 0.02974504857267404,\n \"acc_norm\": 0.2905982905982906,\n \"acc_norm_stderr\": 0.02974504857267404\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2554278416347382,\n \"acc_stderr\": 
0.015594955384455765,\n \"acc_norm\": 0.2554278416347382,\n \"acc_norm_stderr\": 0.015594955384455765\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2446927374301676,\n \"acc_stderr\": 0.01437816988409841,\n \"acc_norm\": 0.2446927374301676,\n \"acc_norm_stderr\": 0.01437816988409841\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.25163398692810457,\n \"acc_stderr\": 0.024848018263875195,\n \"acc_norm\": 0.25163398692810457,\n \"acc_norm_stderr\": 0.024848018263875195\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.18971061093247588,\n \"acc_stderr\": 0.022268196258783218,\n \"acc_norm\": 0.18971061093247588,\n \"acc_norm_stderr\": 0.022268196258783218\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.2191358024691358,\n \"acc_stderr\": 0.0230167056402622,\n \"acc_norm\": 0.2191358024691358,\n \"acc_norm_stderr\": 0.0230167056402622\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.2730496453900709,\n \"acc_stderr\": 0.026577860943307854,\n \"acc_norm\": 0.2730496453900709,\n \"acc_norm_stderr\": 0.026577860943307854\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24511082138200782,\n \"acc_stderr\": 0.010986307870045517,\n \"acc_norm\": 0.24511082138200782,\n \"acc_norm_stderr\": 0.010986307870045517\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.3161764705882353,\n \"acc_stderr\": 0.028245687391462916,\n \"acc_norm\": 0.3161764705882353,\n \"acc_norm_stderr\": 0.028245687391462916\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.30612244897959184,\n \"acc_stderr\": 0.029504896454595957,\n \"acc_norm\": 0.30612244897959184,\n \"acc_norm_stderr\": 0.029504896454595957\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23383084577114427,\n \"acc_stderr\": 0.029929415408348377,\n \"acc_norm\": 0.23383084577114427,\n \"acc_norm_stderr\": 0.029929415408348377\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.23493975903614459,\n \"acc_stderr\": 0.03300533186128922,\n \"acc_norm\": 0.23493975903614459,\n \"acc_norm_stderr\": 0.03300533186128922\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.036155076303109344,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.036155076303109344\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22766217870257038,\n \"mc1_stderr\": 0.014679255032111075,\n \"mc2\": 0.3793773096260545,\n \"mc2_stderr\": 0.01493606177741941\n }\n}\n```", "repo_url": "https://huggingface.co/nicholasKluge/Aira-Instruct-124M", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": 
"harness_arc_challenge_25", "data_files": [{"split": "2023_08_10T09_14_16.516035", "path": ["**/details_harness|arc:challenge|25_2023-08-10T09:14:16.516035.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-10T09:14:16.516035.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_10T09_14_16.516035", "path": ["**/details_harness|hellaswag|10_2023-08-10T09:14:16.516035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-10T09:14:16.516035.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_10T09_14_16.516035", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-10T09:14:16.516035.parquet", 
"**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-10T09:14:16.516035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-10T09:14:16.516035.parquet", 
"**/details_harness|hendrycksTest-econometrics|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-10T09:14:16.516035.parquet", 
"**/details_harness|hendrycksTest-public_relations|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-10T09:14:16.516035.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-10T09:14:16.516035.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_10T09_14_16.516035", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-10T09:14:16.516035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-10T09:14:16.516035.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_10T09_14_16.516035", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-10T09:14:16.516035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-10T09:14:16.516035.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_10T09_14_16.516035", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-10T09:14:16.516035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-10T09:14:16.516035.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_10T09_14_16.516035", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-10T09:14:16.516035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-10T09:14:16.516035.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_10T09_14_16.516035", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-10T09:14:16.516035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-10T09:14:16.516035.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_10T09_14_16.516035", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-10T09:14:16.516035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-10T09:14:16.516035.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_10T09_14_16.516035", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-10T09:14:16.516035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-10T09:14:16.516035.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_10T09_14_16.516035", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-10T09:14:16.516035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-10T09:14:16.516035.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_10T09_14_16.516035", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-10T09:14:16.516035.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-10T09:14:16.516035.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_10T09_14_16.516035", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-10T09:14:16.516035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-10T09:14:16.516035.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_10T09_14_16.516035", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-10T09:14:16.516035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-10T09:14:16.516035.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_10T09_14_16.516035", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-10T09:14:16.516035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-10T09:14:16.516035.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_10T09_14_16.516035", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-10T09:14:16.516035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-10T09:14:16.516035.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_10T09_14_16.516035", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-10T09:14:16.516035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-10T09:14:16.516035.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_10T09_14_16.516035", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-10T09:14:16.516035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-10T09:14:16.516035.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_10T09_14_16.516035", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-10T09:14:16.516035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-10T09:14:16.516035.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_10T09_14_16.516035", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-10T09:14:16.516035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-10T09:14:16.516035.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_10T09_14_16.516035", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-10T09:14:16.516035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-10T09:14:16.516035.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_10T09_14_16.516035", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-10T09:14:16.516035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-10T09:14:16.516035.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_10T09_14_16.516035", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-10T09:14:16.516035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-10T09:14:16.516035.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_10T09_14_16.516035", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-10T09:14:16.516035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-10T09:14:16.516035.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_10T09_14_16.516035", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-10T09:14:16.516035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-10T09:14:16.516035.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_10T09_14_16.516035", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-10T09:14:16.516035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-10T09:14:16.516035.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_10T09_14_16.516035", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-10T09:14:16.516035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-10T09:14:16.516035.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_10T09_14_16.516035", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-10T09:14:16.516035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-10T09:14:16.516035.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_10T09_14_16.516035", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-10T09:14:16.516035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-10T09:14:16.516035.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_10T09_14_16.516035", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-10T09:14:16.516035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-10T09:14:16.516035.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_10T09_14_16.516035", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-10T09:14:16.516035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-10T09:14:16.516035.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_10T09_14_16.516035", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-10T09:14:16.516035.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-10T09:14:16.516035.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_10T09_14_16.516035", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-10T09:14:16.516035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-10T09:14:16.516035.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_10T09_14_16.516035", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-10T09:14:16.516035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-10T09:14:16.516035.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_10T09_14_16.516035", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-10T09:14:16.516035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-10T09:14:16.516035.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_10T09_14_16.516035", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-10T09:14:16.516035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-10T09:14:16.516035.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_10T09_14_16.516035", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-10T09:14:16.516035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-10T09:14:16.516035.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_10T09_14_16.516035", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-10T09:14:16.516035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-10T09:14:16.516035.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_10T09_14_16.516035", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-10T09:14:16.516035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-10T09:14:16.516035.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_10T09_14_16.516035", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-10T09:14:16.516035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-10T09:14:16.516035.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_10T09_14_16.516035", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-10T09:14:16.516035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-10T09:14:16.516035.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_10T09_14_16.516035", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-10T09:14:16.516035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-10T09:14:16.516035.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_10T09_14_16.516035", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-10T09:14:16.516035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-10T09:14:16.516035.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_10T09_14_16.516035", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-10T09:14:16.516035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-10T09:14:16.516035.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_10T09_14_16.516035", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-10T09:14:16.516035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-10T09:14:16.516035.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_10T09_14_16.516035", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-10T09:14:16.516035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-10T09:14:16.516035.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_10T09_14_16.516035", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-10T09:14:16.516035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-10T09:14:16.516035.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_10T09_14_16.516035", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-10T09:14:16.516035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-10T09:14:16.516035.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_10T09_14_16.516035", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-10T09:14:16.516035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-10T09:14:16.516035.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_10T09_14_16.516035", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-10T09:14:16.516035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-10T09:14:16.516035.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_10T09_14_16.516035", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-10T09:14:16.516035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-10T09:14:16.516035.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_10T09_14_16.516035", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-10T09:14:16.516035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-10T09:14:16.516035.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_10T09_14_16.516035", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-10T09:14:16.516035.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-10T09:14:16.516035.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_10T09_14_16.516035", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-10T09:14:16.516035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-10T09:14:16.516035.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_10T09_14_16.516035", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-10T09:14:16.516035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-10T09:14:16.516035.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_10T09_14_16.516035", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-10T09:14:16.516035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-10T09:14:16.516035.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_10T09_14_16.516035", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-10T09:14:16.516035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-10T09:14:16.516035.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_10T09_14_16.516035", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-10T09:14:16.516035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-10T09:14:16.516035.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_10T09_14_16.516035", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-10T09:14:16.516035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-10T09:14:16.516035.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_10T09_14_16.516035", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-10T09:14:16.516035.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-10T09:14:16.516035.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_10T09_14_16.516035", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-10T09:14:16.516035.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-10T09:14:16.516035.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_10T09_14_16.516035", "path": ["results_2023-08-10T09:14:16.516035.parquet"]}, {"split": "latest", "path": ["results_2023-08-10T09:14:16.516035.parquet"]}]}]}
|
2023-08-27T11:32:02+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of nicholasKluge/Aira-Instruct-124M
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model nicholasKluge/Aira-Instruct-124M on the Open LLM Leaderboard.
The dataset is composed of 61 configurations, each corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
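For example, a minimal sketch using the `datasets` library (the repository id below is an assumption, inferred from the leaderboard's `details_<org>__<model>` naming convention; the config name is one of those listed in this card's metadata):

```python
from datasets import load_dataset

# Load the details of one evaluated task; the "train" split always points
# to the latest run (a timestamped split also exists for each individual run).
data = load_dataset(
    "open-llm-leaderboard/details_nicholasKluge__Aira-Instruct-124M",  # inferred repo id
    "harness_truthfulqa_mc_0",  # one of the configs listed in the metadata above
    split="train",
)
```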
## Latest results
These are the latest results from run 2023-08-10T09:14:16.516035 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
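The aggregated numbers are stored in the "results" configuration; a minimal sketch for reading them (same inferred repository id as above; the split names come from this card's metadata):

```python
from datasets import load_dataset

# "latest" always points to the most recent results file; a timestamped
# split (e.g. "2023_08_10T09_14_16.516035") exists for each run.
results = load_dataset(
    "open-llm-leaderboard/details_nicholasKluge__Aira-Instruct-124M",  # inferred repo id
    "results",
    split="latest",
)
```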
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of nicholasKluge/Aira-Instruct-124M",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model nicholasKluge/Aira-Instruct-124M on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-08-10T09:14:16.516035 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of nicholasKluge/Aira-Instruct-124M",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model nicholasKluge/Aira-Instruct-124M on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-08-10T09:14:16.516035 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
23,
31,
171,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of nicholasKluge/Aira-Instruct-124M## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model nicholasKluge/Aira-Instruct-124M on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-08-10T09:14:16.516035 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
020ffb1ab778ae80a5463dbb9752906e8c6bc840
|
# Dataset Card for Evaluation run of Henk717/airochronos-33B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Henk717/airochronos-33B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [Henk717/airochronos-33B](https://huggingface.co/Henk717/airochronos-33B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each corresponding to one of the evaluated tasks.
The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Henk717__airochronos-33B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-17T22:07:20.672645](https://huggingface.co/datasets/open-llm-leaderboard/details_Henk717__airochronos-33B/blob/main/results_2023-09-17T22-07-20.672645.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.003145973154362416,
"em_stderr": 0.0005734993648436351,
"f1": 0.06925440436241624,
"f1_stderr": 0.0014771385536763682,
"acc": 0.46521874156655235,
"acc_stderr": 0.010430187536918111
},
"harness|drop|3": {
"em": 0.003145973154362416,
"em_stderr": 0.0005734993648436351,
"f1": 0.06925440436241624,
"f1_stderr": 0.0014771385536763682
},
"harness|gsm8k|5": {
"acc": 0.1372251705837756,
"acc_stderr": 0.009477808244600422
},
"harness|winogrande|5": {
"acc": 0.7932123125493291,
"acc_stderr": 0.011382566829235798
}
}
```
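These aggregated metrics live in the "results" configuration; a minimal sketch for loading them (the config and split names are taken from the configuration list in this card's metadata):

```python
from datasets import load_dataset

# The "results" config stores the aggregated metrics of every run;
# the "latest" split points to the most recent results file.
results = load_dataset("open-llm-leaderboard/details_Henk717__airochronos-33B",
                       "results",
                       split="latest")
```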
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_Henk717__airochronos-33B
|
[
"region:us"
] |
2023-08-18T10:16:59+00:00
|
{"pretty_name": "Evaluation run of Henk717/airochronos-33B", "dataset_summary": "Dataset automatically created during the evaluation run of model [Henk717/airochronos-33B](https://huggingface.co/Henk717/airochronos-33B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Henk717__airochronos-33B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-17T22:07:20.672645](https://huggingface.co/datasets/open-llm-leaderboard/details_Henk717__airochronos-33B/blob/main/results_2023-09-17T22-07-20.672645.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.003145973154362416,\n \"em_stderr\": 0.0005734993648436351,\n \"f1\": 0.06925440436241624,\n \"f1_stderr\": 0.0014771385536763682,\n \"acc\": 0.46521874156655235,\n \"acc_stderr\": 0.010430187536918111\n },\n \"harness|drop|3\": {\n \"em\": 0.003145973154362416,\n \"em_stderr\": 0.0005734993648436351,\n \"f1\": 0.06925440436241624,\n \"f1_stderr\": 0.0014771385536763682\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1372251705837756,\n \"acc_stderr\": 0.009477808244600422\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7932123125493291,\n \"acc_stderr\": 0.011382566829235798\n }\n}\n```", "repo_url": "https://huggingface.co/Henk717/airochronos-33B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_17T12_26_49.704789", "path": ["**/details_harness|arc:challenge|25_2023-08-17T12:26:49.704789.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-17T12:26:49.704789.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_16T22_55_10.209177", "path": ["**/details_harness|drop|3_2023-09-16T22-55-10.209177.parquet"]}, {"split": "2023_09_17T00_16_43.512970", "path": ["**/details_harness|drop|3_2023-09-17T00-16-43.512970.parquet"]}, {"split": "2023_09_17T22_07_20.672645", "path": ["**/details_harness|drop|3_2023-09-17T22-07-20.672645.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-17T22-07-20.672645.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_16T22_55_10.209177", "path": ["**/details_harness|gsm8k|5_2023-09-16T22-55-10.209177.parquet"]}, {"split": "2023_09_17T00_16_43.512970", "path": ["**/details_harness|gsm8k|5_2023-09-17T00-16-43.512970.parquet"]}, {"split": "2023_09_17T22_07_20.672645", "path": 
["**/details_harness|gsm8k|5_2023-09-17T22-07-20.672645.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-17T22-07-20.672645.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_17T12_26_49.704789", "path": ["**/details_harness|hellaswag|10_2023-08-17T12:26:49.704789.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-17T12:26:49.704789.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_17T12_26_49.704789", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T12:26:49.704789.parquet", 
"**/details_harness|hendrycksTest-human_aging|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-17T12:26:49.704789.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-17T12:26:49.704789.parquet", 
"**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-17T12:26:49.704789.parquet", 
"**/details_harness|hendrycksTest-security_studies|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-17T12:26:49.704789.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-17T12:26:49.704789.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_17T12_26_49.704789", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T12:26:49.704789.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T12:26:49.704789.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_17T12_26_49.704789", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-17T12:26:49.704789.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-17T12:26:49.704789.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_17T12_26_49.704789", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-17T12:26:49.704789.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-17T12:26:49.704789.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_17T12_26_49.704789", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T12:26:49.704789.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T12:26:49.704789.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_17T12_26_49.704789", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T12:26:49.704789.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T12:26:49.704789.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_17T12_26_49.704789", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-17T12:26:49.704789.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-17T12:26:49.704789.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_17T12_26_49.704789", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T12:26:49.704789.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T12:26:49.704789.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_17T12_26_49.704789", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T12:26:49.704789.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T12:26:49.704789.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_17T12_26_49.704789", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T12:26:49.704789.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T12:26:49.704789.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": 
[{"split": "2023_08_17T12_26_49.704789", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T12:26:49.704789.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T12:26:49.704789.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_17T12_26_49.704789", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-17T12:26:49.704789.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-17T12:26:49.704789.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_17T12_26_49.704789", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-17T12:26:49.704789.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-17T12:26:49.704789.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_17T12_26_49.704789", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T12:26:49.704789.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T12:26:49.704789.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_17T12_26_49.704789", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-17T12:26:49.704789.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-17T12:26:49.704789.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_17T12_26_49.704789", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T12:26:49.704789.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T12:26:49.704789.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_17T12_26_49.704789", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T12:26:49.704789.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T12:26:49.704789.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_17T12_26_49.704789", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T12:26:49.704789.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T12:26:49.704789.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_17T12_26_49.704789", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-17T12:26:49.704789.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-17T12:26:49.704789.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_17T12_26_49.704789", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T12:26:49.704789.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T12:26:49.704789.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_17T12_26_49.704789", "path": 
["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T12:26:49.704789.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T12:26:49.704789.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_17T12_26_49.704789", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T12:26:49.704789.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T12:26:49.704789.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_17T12_26_49.704789", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T12:26:49.704789.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T12:26:49.704789.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_17T12_26_49.704789", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T12:26:49.704789.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T12:26:49.704789.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_17T12_26_49.704789", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T12:26:49.704789.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T12:26:49.704789.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_17T12_26_49.704789", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T12:26:49.704789.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T12:26:49.704789.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_17T12_26_49.704789", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T12:26:49.704789.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T12:26:49.704789.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_17T12_26_49.704789", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T12:26:49.704789.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T12:26:49.704789.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_17T12_26_49.704789", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T12:26:49.704789.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T12:26:49.704789.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_17T12_26_49.704789", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T12:26:49.704789.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T12:26:49.704789.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_17T12_26_49.704789", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T12:26:49.704789.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T12:26:49.704789.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_17T12_26_49.704789", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T12:26:49.704789.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T12:26:49.704789.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_17T12_26_49.704789", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T12:26:49.704789.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T12:26:49.704789.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_17T12_26_49.704789", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-17T12:26:49.704789.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-17T12:26:49.704789.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_17T12_26_49.704789", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T12:26:49.704789.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T12:26:49.704789.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_17T12_26_49.704789", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-17T12:26:49.704789.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-17T12:26:49.704789.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_17T12_26_49.704789", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T12:26:49.704789.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T12:26:49.704789.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_17T12_26_49.704789", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T12:26:49.704789.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T12:26:49.704789.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_17T12_26_49.704789", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T12:26:49.704789.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T12:26:49.704789.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_17T12_26_49.704789", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-17T12:26:49.704789.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-17T12:26:49.704789.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_17T12_26_49.704789", "path": 
["**/details_harness|hendrycksTest-marketing|5_2023-08-17T12:26:49.704789.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-17T12:26:49.704789.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_17T12_26_49.704789", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T12:26:49.704789.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T12:26:49.704789.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_17T12_26_49.704789", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T12:26:49.704789.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T12:26:49.704789.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_17T12_26_49.704789", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T12:26:49.704789.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T12:26:49.704789.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_17T12_26_49.704789", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T12:26:49.704789.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T12:26:49.704789.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_17T12_26_49.704789", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-17T12:26:49.704789.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-17T12:26:49.704789.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_17T12_26_49.704789", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-17T12:26:49.704789.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-17T12:26:49.704789.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_17T12_26_49.704789", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-17T12:26:49.704789.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-17T12:26:49.704789.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_17T12_26_49.704789", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T12:26:49.704789.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T12:26:49.704789.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_17T12_26_49.704789", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-17T12:26:49.704789.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-17T12:26:49.704789.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_17T12_26_49.704789", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T12:26:49.704789.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T12:26:49.704789.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_17T12_26_49.704789", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T12:26:49.704789.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T12:26:49.704789.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_17T12_26_49.704789", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-17T12:26:49.704789.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-17T12:26:49.704789.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_17T12_26_49.704789", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-17T12:26:49.704789.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-17T12:26:49.704789.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_17T12_26_49.704789", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-17T12:26:49.704789.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-17T12:26:49.704789.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_17T12_26_49.704789", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T12:26:49.704789.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T12:26:49.704789.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_17T12_26_49.704789", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-17T12:26:49.704789.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-17T12:26:49.704789.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_17T12_26_49.704789", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-17T12:26:49.704789.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-17T12:26:49.704789.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_17T12_26_49.704789", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-17T12:26:49.704789.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-17T12:26:49.704789.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_16T22_55_10.209177", "path": ["**/details_harness|winogrande|5_2023-09-16T22-55-10.209177.parquet"]}, {"split": "2023_09_17T00_16_43.512970", "path": ["**/details_harness|winogrande|5_2023-09-17T00-16-43.512970.parquet"]}, {"split": "2023_09_17T22_07_20.672645", "path": ["**/details_harness|winogrande|5_2023-09-17T22-07-20.672645.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-17T22-07-20.672645.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_17T12_26_49.704789", "path": ["results_2023-08-17T12:26:49.704789.parquet"]}, {"split": "2023_09_16T22_55_10.209177", "path": ["results_2023-09-16T22-55-10.209177.parquet"]}, {"split": "2023_09_17T00_16_43.512970", "path": ["results_2023-09-17T00-16-43.512970.parquet"]}, {"split": "2023_09_17T22_07_20.672645", "path": 
["results_2023-09-17T22-07-20.672645.parquet"]}, {"split": "latest", "path": ["results_2023-09-17T22-07-20.672645.parquet"]}]}]}
|
2023-09-17T21:07:28+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of Henk717/airochronos-33B
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model Henk717/airochronos-33B on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each corresponding to one of the evaluated tasks.
The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
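For instance, using the `datasets` library (a minimal sketch mirroring the loading example from the full card above):

```python
from datasets import load_dataset

# The config name selects one evaluated task; the "train" split
# always points to the latest run.
data = load_dataset("open-llm-leaderboard/details_Henk717__airochronos-33B",
                    "harness_winogrande_5",
                    split="train")
```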
## Latest results
These are the latest results from run 2023-09-17T22:07:20.672645 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of Henk717/airochronos-33B",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Henk717/airochronos-33B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-17T22:07:20.672645(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Henk717/airochronos-33B",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Henk717/airochronos-33B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-17T22:07:20.672645(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
19,
31,
167,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Henk717/airochronos-33B## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Henk717/airochronos-33B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-17T22:07:20.672645(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
1c2f1a228b2cefeefa8dce90dd2d98bcdc2bd55c
|
# Dataset Card for Evaluation run of Henk717/chronoboros-33B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Henk717/chronoboros-33B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [Henk717/chronoboros-33B](https://huggingface.co/Henk717/chronoboros-33B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Henk717__chronoboros-33B",
"harness_winogrande_5",
split="train")
```
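
The same call pattern works for the other configurations. As a minimal sketch (the config and split names below are taken from the configuration list in this card's metadata, so treat them as assumptions if that list changes), you could pull the aggregated "results" configuration or the details of a single task:

```python
from datasets import load_dataset

repo = "open-llm-leaderboard/details_Henk717__chronoboros-33B"

# Aggregated metrics for the newest run (the "results" config, "latest" split).
results = load_dataset(repo, "results", split="latest")

# Per-sample details for one task, e.g. 5-shot GSM8K.
gsm8k_details = load_dataset(repo, "harness_gsm8k_5", split="latest")
```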
## Latest results
These are the [latest results from run 2023-10-12T16:31:44.204572](https://huggingface.co/datasets/open-llm-leaderboard/details_Henk717__chronoboros-33B/blob/main/results_2023-10-12T16-31-44.204572.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.001572986577181208,
"em_stderr": 0.00040584511324177333,
"f1": 0.06618078859060408,
"f1_stderr": 0.001414224388811973,
"acc": 0.4767932464203287,
"acc_stderr": 0.010503355727238265
},
"harness|drop|3": {
"em": 0.001572986577181208,
"em_stderr": 0.00040584511324177333,
"f1": 0.06618078859060408,
"f1_stderr": 0.001414224388811973
},
"harness|gsm8k|5": {
"acc": 0.15011372251705837,
"acc_stderr": 0.009838590860906965
},
"harness|winogrande|5": {
"acc": 0.8034727703235991,
"acc_stderr": 0.011168120593569565
}
}
```
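
The metric blocks above are also reachable programmatically. If you first want to see which configurations and run timestamps exist, the `datasets` inspection helpers can list them; a hedged sketch, assuming these helpers are available in your installed version of `datasets`:

```python
from datasets import get_dataset_config_names, get_dataset_split_names

repo = "open-llm-leaderboard/details_Henk717__chronoboros-33B"

# The 64 per-task configurations plus the aggregated "results" config.
print(get_dataset_config_names(repo))

# One split per run timestamp, plus the moving "latest" split.
print(get_dataset_split_names(repo, "results"))
```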
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_Henk717__chronoboros-33B
|
[
"region:us"
] |
2023-08-18T10:17:08+00:00
|
{"pretty_name": "Evaluation run of Henk717/chronoboros-33B", "dataset_summary": "Dataset automatically created during the evaluation run of model [Henk717/chronoboros-33B](https://huggingface.co/Henk717/chronoboros-33B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Henk717__chronoboros-33B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-12T16:31:44.204572](https://huggingface.co/datasets/open-llm-leaderboard/details_Henk717__chronoboros-33B/blob/main/results_2023-10-12T16-31-44.204572.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.001572986577181208,\n \"em_stderr\": 0.00040584511324177333,\n \"f1\": 0.06618078859060408,\n \"f1_stderr\": 0.001414224388811973,\n \"acc\": 0.4767932464203287,\n \"acc_stderr\": 0.010503355727238265\n },\n \"harness|drop|3\": {\n \"em\": 0.001572986577181208,\n \"em_stderr\": 0.00040584511324177333,\n \"f1\": 0.06618078859060408,\n \"f1_stderr\": 0.001414224388811973\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.15011372251705837,\n \"acc_stderr\": 0.009838590860906965\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8034727703235991,\n \"acc_stderr\": 0.011168120593569565\n }\n}\n```", "repo_url": "https://huggingface.co/Henk717/chronoboros-33B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_31T13_28_54.015412", "path": ["**/details_harness|arc:challenge|25_2023-07-31T13:28:54.015412.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-31T13:28:54.015412.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_22T18_14_16.970701", "path": ["**/details_harness|drop|3_2023-09-22T18-14-16.970701.parquet"]}, {"split": "2023_10_12T16_31_44.204572", "path": ["**/details_harness|drop|3_2023-10-12T16-31-44.204572.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-12T16-31-44.204572.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_22T18_14_16.970701", "path": ["**/details_harness|gsm8k|5_2023-09-22T18-14-16.970701.parquet"]}, {"split": "2023_10_12T16_31_44.204572", "path": ["**/details_harness|gsm8k|5_2023-10-12T16-31-44.204572.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-12T16-31-44.204572.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_31T13_28_54.015412", "path": 
["**/details_harness|hellaswag|10_2023-07-31T13:28:54.015412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-31T13:28:54.015412.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_31T13_28_54.015412", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-31T13:28:54.015412.parquet", 
"**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-31T13:28:54.015412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T13:28:54.015412.parquet", 
"**/details_harness|hendrycksTest-global_facts|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T13:28:54.015412.parquet", 
"**/details_harness|hendrycksTest-virology|5_2023-07-31T13:28:54.015412.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-31T13:28:54.015412.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_31T13_28_54.015412", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T13:28:54.015412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T13:28:54.015412.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_31T13_28_54.015412", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-31T13:28:54.015412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-31T13:28:54.015412.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_31T13_28_54.015412", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-31T13:28:54.015412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-31T13:28:54.015412.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_31T13_28_54.015412", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T13:28:54.015412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T13:28:54.015412.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_31T13_28_54.015412", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T13:28:54.015412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T13:28:54.015412.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_31T13_28_54.015412", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-31T13:28:54.015412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-31T13:28:54.015412.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_31T13_28_54.015412", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T13:28:54.015412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T13:28:54.015412.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_31T13_28_54.015412", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T13:28:54.015412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T13:28:54.015412.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_31T13_28_54.015412", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T13:28:54.015412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T13:28:54.015412.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_31T13_28_54.015412", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T13:28:54.015412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T13:28:54.015412.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_31T13_28_54.015412", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-31T13:28:54.015412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-31T13:28:54.015412.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_31T13_28_54.015412", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-31T13:28:54.015412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-31T13:28:54.015412.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_31T13_28_54.015412", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T13:28:54.015412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T13:28:54.015412.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_31T13_28_54.015412", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-31T13:28:54.015412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-31T13:28:54.015412.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_31T13_28_54.015412", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T13:28:54.015412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T13:28:54.015412.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_31T13_28_54.015412", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T13:28:54.015412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T13:28:54.015412.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_31T13_28_54.015412", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T13:28:54.015412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T13:28:54.015412.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_31T13_28_54.015412", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-31T13:28:54.015412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-31T13:28:54.015412.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_31T13_28_54.015412", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T13:28:54.015412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T13:28:54.015412.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_31T13_28_54.015412", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T13:28:54.015412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T13:28:54.015412.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_31T13_28_54.015412", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T13:28:54.015412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T13:28:54.015412.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_31T13_28_54.015412", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T13:28:54.015412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T13:28:54.015412.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_31T13_28_54.015412", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T13:28:54.015412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T13:28:54.015412.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_31T13_28_54.015412", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T13:28:54.015412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T13:28:54.015412.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_31T13_28_54.015412", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T13:28:54.015412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T13:28:54.015412.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_31T13_28_54.015412", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T13:28:54.015412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T13:28:54.015412.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_31T13_28_54.015412", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T13:28:54.015412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T13:28:54.015412.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_31T13_28_54.015412", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T13:28:54.015412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T13:28:54.015412.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_31T13_28_54.015412", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T13:28:54.015412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T13:28:54.015412.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_31T13_28_54.015412", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T13:28:54.015412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T13:28:54.015412.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_31T13_28_54.015412", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T13:28:54.015412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T13:28:54.015412.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_31T13_28_54.015412", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T13:28:54.015412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T13:28:54.015412.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_31T13_28_54.015412", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-31T13:28:54.015412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-31T13:28:54.015412.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_31T13_28_54.015412", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T13:28:54.015412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T13:28:54.015412.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_31T13_28_54.015412", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-31T13:28:54.015412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-31T13:28:54.015412.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_31T13_28_54.015412", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T13:28:54.015412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T13:28:54.015412.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_31T13_28_54.015412", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T13:28:54.015412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T13:28:54.015412.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_31T13_28_54.015412", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T13:28:54.015412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T13:28:54.015412.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_31T13_28_54.015412", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-31T13:28:54.015412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-31T13:28:54.015412.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_31T13_28_54.015412", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-31T13:28:54.015412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-31T13:28:54.015412.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_31T13_28_54.015412", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T13:28:54.015412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T13:28:54.015412.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_31T13_28_54.015412", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T13:28:54.015412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T13:28:54.015412.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_31T13_28_54.015412", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T13:28:54.015412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T13:28:54.015412.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_31T13_28_54.015412", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T13:28:54.015412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T13:28:54.015412.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_31T13_28_54.015412", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-31T13:28:54.015412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-31T13:28:54.015412.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_31T13_28_54.015412", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-31T13:28:54.015412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-31T13:28:54.015412.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_31T13_28_54.015412", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-31T13:28:54.015412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-31T13:28:54.015412.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_31T13_28_54.015412", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T13:28:54.015412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T13:28:54.015412.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_31T13_28_54.015412", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-31T13:28:54.015412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-31T13:28:54.015412.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_31T13_28_54.015412", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T13:28:54.015412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T13:28:54.015412.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_31T13_28_54.015412", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T13:28:54.015412.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T13:28:54.015412.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_31T13_28_54.015412", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-31T13:28:54.015412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-31T13:28:54.015412.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_31T13_28_54.015412", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-31T13:28:54.015412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-31T13:28:54.015412.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_31T13_28_54.015412", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-31T13:28:54.015412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-31T13:28:54.015412.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_31T13_28_54.015412", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T13:28:54.015412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T13:28:54.015412.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_31T13_28_54.015412", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-31T13:28:54.015412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-31T13:28:54.015412.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_31T13_28_54.015412", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-31T13:28:54.015412.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-31T13:28:54.015412.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_31T13_28_54.015412", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-31T13:28:54.015412.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-31T13:28:54.015412.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_22T18_14_16.970701", "path": ["**/details_harness|winogrande|5_2023-09-22T18-14-16.970701.parquet"]}, {"split": "2023_10_12T16_31_44.204572", "path": ["**/details_harness|winogrande|5_2023-10-12T16-31-44.204572.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-12T16-31-44.204572.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_31T13_28_54.015412", "path": ["results_2023-07-31T13:28:54.015412.parquet"]}, {"split": "2023_09_22T18_14_16.970701", "path": ["results_2023-09-22T18-14-16.970701.parquet"]}, {"split": "2023_10_12T16_31_44.204572", "path": ["results_2023-10-12T16-31-44.204572.parquet"]}, {"split": "latest", "path": ["results_2023-10-12T16-31-44.204572.parquet"]}]}]}
|
2023-10-12T15:31:53+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of Henk717/chronoboros-33B
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model Henk717/chronoboros-33B on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-10-12T16:31:44.204572 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of Henk717/chronoboros-33B",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Henk717/chronoboros-33B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-12T16:31:44.204572(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Henk717/chronoboros-33B",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Henk717/chronoboros-33B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-12T16:31:44.204572(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
18,
31,
166,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Henk717/chronoboros-33B## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Henk717/chronoboros-33B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-12T16:31:44.204572(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
0668b8943bab74402cf91ba088e89764adc876a1
|
# Dataset Card for Evaluation run of NbAiLab/nb-gpt-j-6B-alpaca
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/NbAiLab/nb-gpt-j-6B-alpaca
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [NbAiLab/nb-gpt-j-6B-alpaca](https://huggingface.co/NbAiLab/nb-gpt-j-6B-alpaca) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_NbAiLab__nb-gpt-j-6B-alpaca",
"harness_truthfulqa_mc_0",
split="train")
```
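
To reproduce an analysis, it can be safer to pin a specific run rather than the moving "train" split. A sketch under the assumption that split names follow the timestamp-to-split convention seen in the sibling leaderboard datasets (dashes and colons in the run timestamp replaced with underscores):

```python
from datasets import load_dataset

# Pin the 2023-07-19 run explicitly; the split name below is an assumption
# derived from the timestamp naming used by these leaderboard datasets.
data = load_dataset(
    "open-llm-leaderboard/details_NbAiLab__nb-gpt-j-6B-alpaca",
    "harness_truthfulqa_mc_0",
    split="2023_07_19T15_55_19.313530",
)
```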
## Latest results
These are the [latest results from run 2023-07-19T15:55:19.313530](https://huggingface.co/datasets/open-llm-leaderboard/details_NbAiLab__nb-gpt-j-6B-alpaca/blob/main/results_2023-07-19T15%3A55%3A19.313530.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2793361324347659,
"acc_stderr": 0.03229367242405252,
"acc_norm": 0.2819200448548028,
"acc_norm_stderr": 0.03229676260655511,
"mc1": 0.23623011015911874,
"mc1_stderr": 0.014869755015871114,
"mc2": 0.3799804264451723,
"mc2_stderr": 0.014771856203795355
},
"harness|arc:challenge|25": {
"acc": 0.3447098976109215,
"acc_stderr": 0.013888816286782114,
"acc_norm": 0.36860068259385664,
"acc_norm_stderr": 0.014097810678042184
},
"harness|hellaswag|10": {
"acc": 0.44602668791077477,
"acc_stderr": 0.004960624576987787,
"acc_norm": 0.574586735710018,
"acc_norm_stderr": 0.004933950953380902
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.039446241625011175,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.039446241625011175
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17105263157894737,
"acc_stderr": 0.030643607071677088,
"acc_norm": 0.17105263157894737,
"acc_norm_stderr": 0.030643607071677088
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.23773584905660378,
"acc_stderr": 0.02619980880756193,
"acc_norm": 0.23773584905660378,
"acc_norm_stderr": 0.02619980880756193
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3472222222222222,
"acc_stderr": 0.039812405437178615,
"acc_norm": 0.3472222222222222,
"acc_norm_stderr": 0.039812405437178615
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2658959537572254,
"acc_stderr": 0.0336876293225943,
"acc_norm": 0.2658959537572254,
"acc_norm_stderr": 0.0336876293225943
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.04655010411319619,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.04655010411319619
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322674,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322674
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3148936170212766,
"acc_stderr": 0.030363582197238156,
"acc_norm": 0.3148936170212766,
"acc_norm_stderr": 0.030363582197238156
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.22807017543859648,
"acc_stderr": 0.03947152782669415,
"acc_norm": 0.22807017543859648,
"acc_norm_stderr": 0.03947152782669415
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2482758620689655,
"acc_stderr": 0.03600105692727771,
"acc_norm": 0.2482758620689655,
"acc_norm_stderr": 0.03600105692727771
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.022418042891113942,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.022418042891113942
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.03718489006818115,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.03718489006818115
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.23548387096774193,
"acc_stderr": 0.02413763242933772,
"acc_norm": 0.23548387096774193,
"acc_norm_stderr": 0.02413763242933772
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.270935960591133,
"acc_stderr": 0.03127090713297698,
"acc_norm": 0.270935960591133,
"acc_norm_stderr": 0.03127090713297698
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384739,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384739
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2606060606060606,
"acc_stderr": 0.034277431758165236,
"acc_norm": 0.2606060606060606,
"acc_norm_stderr": 0.034277431758165236
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.35858585858585856,
"acc_stderr": 0.03416903640391521,
"acc_norm": 0.35858585858585856,
"acc_norm_stderr": 0.03416903640391521
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.35751295336787564,
"acc_stderr": 0.034588160421810045,
"acc_norm": 0.35751295336787564,
"acc_norm_stderr": 0.034588160421810045
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.23846153846153847,
"acc_stderr": 0.02160629449464773,
"acc_norm": 0.23846153846153847,
"acc_norm_stderr": 0.02160629449464773
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.22592592592592592,
"acc_stderr": 0.025497532639609542,
"acc_norm": 0.22592592592592592,
"acc_norm_stderr": 0.025497532639609542
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.19327731092436976,
"acc_stderr": 0.025649470265889193,
"acc_norm": 0.19327731092436976,
"acc_norm_stderr": 0.025649470265889193
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.037345356767871984,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.037345356767871984
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.326605504587156,
"acc_stderr": 0.020106990889937303,
"acc_norm": 0.326605504587156,
"acc_norm_stderr": 0.020106990889937303
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.03400603625538272,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.03400603625538272
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.22058823529411764,
"acc_stderr": 0.029102254389674093,
"acc_norm": 0.22058823529411764,
"acc_norm_stderr": 0.029102254389674093
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.25738396624472576,
"acc_stderr": 0.028458820991460302,
"acc_norm": 0.25738396624472576,
"acc_norm_stderr": 0.028458820991460302
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.2600896860986547,
"acc_stderr": 0.029442495585857487,
"acc_norm": 0.2600896860986547,
"acc_norm_stderr": 0.029442495585857487
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.20610687022900764,
"acc_stderr": 0.035477710041594626,
"acc_norm": 0.20610687022900764,
"acc_norm_stderr": 0.035477710041594626
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2975206611570248,
"acc_stderr": 0.04173349148083498,
"acc_norm": 0.2975206611570248,
"acc_norm_stderr": 0.04173349148083498
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.04236511258094633,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.04236511258094633
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.294478527607362,
"acc_stderr": 0.03581165790474082,
"acc_norm": 0.294478527607362,
"acc_norm_stderr": 0.03581165790474082
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.16964285714285715,
"acc_stderr": 0.0356236785009539,
"acc_norm": 0.16964285714285715,
"acc_norm_stderr": 0.0356236785009539
},
"harness|hendrycksTest-management|5": {
"acc": 0.20388349514563106,
"acc_stderr": 0.03989139859531772,
"acc_norm": 0.20388349514563106,
"acc_norm_stderr": 0.03989139859531772
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.21794871794871795,
"acc_stderr": 0.02704685763071666,
"acc_norm": 0.21794871794871795,
"acc_norm_stderr": 0.02704685763071666
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.3052362707535121,
"acc_stderr": 0.01646771194763513,
"acc_norm": 0.3052362707535121,
"acc_norm_stderr": 0.01646771194763513
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2861271676300578,
"acc_stderr": 0.024332146779134128,
"acc_norm": 0.2861271676300578,
"acc_norm_stderr": 0.024332146779134128
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27262569832402234,
"acc_stderr": 0.014893391735249588,
"acc_norm": 0.27262569832402234,
"acc_norm_stderr": 0.014893391735249588
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.3137254901960784,
"acc_stderr": 0.02656892101545715,
"acc_norm": 0.3137254901960784,
"acc_norm_stderr": 0.02656892101545715
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2958199356913183,
"acc_stderr": 0.025922371788818777,
"acc_norm": 0.2958199356913183,
"acc_norm_stderr": 0.025922371788818777
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.22839506172839505,
"acc_stderr": 0.023358211840626267,
"acc_norm": 0.22839506172839505,
"acc_norm_stderr": 0.023358211840626267
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.24113475177304963,
"acc_stderr": 0.025518731049537776,
"acc_norm": 0.24113475177304963,
"acc_norm_stderr": 0.025518731049537776
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.23989569752281617,
"acc_stderr": 0.010906282617981657,
"acc_norm": 0.23989569752281617,
"acc_norm_stderr": 0.010906282617981657
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.030161911930767102,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.030161911930767102
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.018054027458815194,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.018054027458815194
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2909090909090909,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.2909090909090909,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3877551020408163,
"acc_stderr": 0.03119223072679566,
"acc_norm": 0.3877551020408163,
"acc_norm_stderr": 0.03119223072679566
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.21393034825870647,
"acc_stderr": 0.02899690969332891,
"acc_norm": 0.21393034825870647,
"acc_norm_stderr": 0.02899690969332891
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-virology|5": {
"acc": 0.27710843373493976,
"acc_stderr": 0.03484331592680588,
"acc_norm": 0.27710843373493976,
"acc_norm_stderr": 0.03484331592680588
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2573099415204678,
"acc_stderr": 0.03352799844161865,
"acc_norm": 0.2573099415204678,
"acc_norm_stderr": 0.03352799844161865
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23623011015911874,
"mc1_stderr": 0.014869755015871114,
"mc2": 0.3799804264451723,
"mc2_stderr": 0.014771856203795355
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_NbAiLab__nb-gpt-j-6B-alpaca
|
[
"region:us"
] |
2023-08-18T10:17:16+00:00
|
{"pretty_name": "Evaluation run of NbAiLab/nb-gpt-j-6B-alpaca", "dataset_summary": "Dataset automatically created during the evaluation run of model [NbAiLab/nb-gpt-j-6B-alpaca](https://huggingface.co/NbAiLab/nb-gpt-j-6B-alpaca) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NbAiLab__nb-gpt-j-6B-alpaca\",\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-07-19T15:55:19.313530](https://huggingface.co/datasets/open-llm-leaderboard/details_NbAiLab__nb-gpt-j-6B-alpaca/blob/main/results_2023-07-19T15%3A55%3A19.313530.json) (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2793361324347659,\n \"acc_stderr\": 0.03229367242405252,\n \"acc_norm\": 0.2819200448548028,\n \"acc_norm_stderr\": 0.03229676260655511,\n \"mc1\": 0.23623011015911874,\n \"mc1_stderr\": 0.014869755015871114,\n \"mc2\": 0.3799804264451723,\n \"mc2_stderr\": 0.014771856203795355\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.3447098976109215,\n \"acc_stderr\": 0.013888816286782114,\n \"acc_norm\": 0.36860068259385664,\n \"acc_norm_stderr\": 0.014097810678042184\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.44602668791077477,\n \"acc_stderr\": 0.004960624576987787,\n \"acc_norm\": 0.574586735710018,\n \"acc_norm_stderr\": 0.004933950953380902\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2962962962962963,\n \"acc_stderr\": 0.039446241625011175,\n \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.039446241625011175\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.17105263157894737,\n \"acc_stderr\": 0.030643607071677088,\n \"acc_norm\": 0.17105263157894737,\n \"acc_norm_stderr\": 0.030643607071677088\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.23773584905660378,\n \"acc_stderr\": 0.02619980880756193,\n \"acc_norm\": 0.23773584905660378,\n \"acc_norm_stderr\": 0.02619980880756193\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3472222222222222,\n \"acc_stderr\": 0.039812405437178615,\n \"acc_norm\": 0.3472222222222222,\n \"acc_norm_stderr\": 0.039812405437178615\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932269,\n 
\"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932269\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2658959537572254,\n \"acc_stderr\": 0.0336876293225943,\n \"acc_norm\": 0.2658959537572254,\n \"acc_norm_stderr\": 0.0336876293225943\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.04655010411319619,\n \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.04655010411319619\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322674,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322674\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.3148936170212766,\n \"acc_stderr\": 0.030363582197238156,\n \"acc_norm\": 0.3148936170212766,\n \"acc_norm_stderr\": 0.030363582197238156\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.22807017543859648,\n \"acc_stderr\": 0.03947152782669415,\n \"acc_norm\": 0.22807017543859648,\n \"acc_norm_stderr\": 0.03947152782669415\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2482758620689655,\n \"acc_stderr\": 0.03600105692727771,\n \"acc_norm\": 0.2482758620689655,\n \"acc_norm_stderr\": 0.03600105692727771\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.25396825396825395,\n \"acc_stderr\": 0.022418042891113942,\n \"acc_norm\": 0.25396825396825395,\n \"acc_norm_stderr\": 0.022418042891113942\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.03718489006818115,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.03718489006818115\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932269\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.23548387096774193,\n \"acc_stderr\": 0.02413763242933772,\n \"acc_norm\": 0.23548387096774193,\n \"acc_norm_stderr\": 0.02413763242933772\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.270935960591133,\n \"acc_stderr\": 0.03127090713297698,\n \"acc_norm\": 0.270935960591133,\n \"acc_norm_stderr\": 0.03127090713297698\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384739,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384739\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.2606060606060606,\n \"acc_stderr\": 0.034277431758165236,\n \"acc_norm\": 0.2606060606060606,\n \"acc_norm_stderr\": 0.034277431758165236\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.35858585858585856,\n \"acc_stderr\": 0.03416903640391521,\n \"acc_norm\": 0.35858585858585856,\n \"acc_norm_stderr\": 0.03416903640391521\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.35751295336787564,\n \"acc_stderr\": 0.034588160421810045,\n \"acc_norm\": 0.35751295336787564,\n \"acc_norm_stderr\": 0.034588160421810045\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n 
\"acc\": 0.23846153846153847,\n \"acc_stderr\": 0.02160629449464773,\n \"acc_norm\": 0.23846153846153847,\n \"acc_norm_stderr\": 0.02160629449464773\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.22592592592592592,\n \"acc_stderr\": 0.025497532639609542,\n \"acc_norm\": 0.22592592592592592,\n \"acc_norm_stderr\": 0.025497532639609542\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.19327731092436976,\n \"acc_stderr\": 0.025649470265889193,\n \"acc_norm\": 0.19327731092436976,\n \"acc_norm_stderr\": 0.025649470265889193\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.326605504587156,\n \"acc_stderr\": 0.020106990889937303,\n \"acc_norm\": 0.326605504587156,\n \"acc_norm_stderr\": 0.020106990889937303\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.46296296296296297,\n \"acc_stderr\": 0.03400603625538272,\n \"acc_norm\": 0.46296296296296297,\n \"acc_norm_stderr\": 0.03400603625538272\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.22058823529411764,\n \"acc_stderr\": 0.029102254389674093,\n \"acc_norm\": 0.22058823529411764,\n \"acc_norm_stderr\": 0.029102254389674093\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.25738396624472576,\n \"acc_stderr\": 0.028458820991460302,\n \"acc_norm\": 0.25738396624472576,\n \"acc_norm_stderr\": 0.028458820991460302\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.2600896860986547,\n \"acc_stderr\": 0.029442495585857487,\n \"acc_norm\": 0.2600896860986547,\n \"acc_norm_stderr\": 0.029442495585857487\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.20610687022900764,\n \"acc_stderr\": 0.035477710041594626,\n \"acc_norm\": 0.20610687022900764,\n \"acc_norm_stderr\": 0.035477710041594626\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2975206611570248,\n \"acc_stderr\": 0.04173349148083498,\n \"acc_norm\": 0.2975206611570248,\n \"acc_norm_stderr\": 0.04173349148083498\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.04236511258094633,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.04236511258094633\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.294478527607362,\n \"acc_stderr\": 0.03581165790474082,\n \"acc_norm\": 0.294478527607362,\n \"acc_norm_stderr\": 0.03581165790474082\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.16964285714285715,\n \"acc_stderr\": 0.0356236785009539,\n \"acc_norm\": 0.16964285714285715,\n \"acc_norm_stderr\": 0.0356236785009539\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.20388349514563106,\n \"acc_stderr\": 0.03989139859531772,\n \"acc_norm\": 0.20388349514563106,\n \"acc_norm_stderr\": 0.03989139859531772\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.21794871794871795,\n \"acc_stderr\": 0.02704685763071666,\n \"acc_norm\": 0.21794871794871795,\n \"acc_norm_stderr\": 0.02704685763071666\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.3052362707535121,\n \"acc_stderr\": 
0.01646771194763513,\n \"acc_norm\": 0.3052362707535121,\n \"acc_norm_stderr\": 0.01646771194763513\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.2861271676300578,\n \"acc_stderr\": 0.024332146779134128,\n \"acc_norm\": 0.2861271676300578,\n \"acc_norm_stderr\": 0.024332146779134128\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27262569832402234,\n \"acc_stderr\": 0.014893391735249588,\n \"acc_norm\": 0.27262569832402234,\n \"acc_norm_stderr\": 0.014893391735249588\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.02656892101545715,\n \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.02656892101545715\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2958199356913183,\n \"acc_stderr\": 0.025922371788818777,\n \"acc_norm\": 0.2958199356913183,\n \"acc_norm_stderr\": 0.025922371788818777\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.22839506172839505,\n \"acc_stderr\": 0.023358211840626267,\n \"acc_norm\": 0.22839506172839505,\n \"acc_norm_stderr\": 0.023358211840626267\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.24113475177304963,\n \"acc_stderr\": 0.025518731049537776,\n \"acc_norm\": 0.24113475177304963,\n \"acc_norm_stderr\": 0.025518731049537776\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.23989569752281617,\n \"acc_stderr\": 0.010906282617981657,\n \"acc_norm\": 0.23989569752281617,\n \"acc_norm_stderr\": 0.010906282617981657\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.030161911930767102,\n \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.030161911930767102\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.018054027458815194,\n \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.018054027458815194\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2909090909090909,\n \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.2909090909090909,\n \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.3877551020408163,\n \"acc_stderr\": 0.03119223072679566,\n \"acc_norm\": 0.3877551020408163,\n \"acc_norm_stderr\": 0.03119223072679566\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.21393034825870647,\n \"acc_stderr\": 0.02899690969332891,\n \"acc_norm\": 0.21393034825870647,\n \"acc_norm_stderr\": 0.02899690969332891\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.27710843373493976,\n \"acc_stderr\": 0.03484331592680588,\n \"acc_norm\": 0.27710843373493976,\n \"acc_norm_stderr\": 0.03484331592680588\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.2573099415204678,\n \"acc_stderr\": 0.03352799844161865,\n \"acc_norm\": 0.2573099415204678,\n \"acc_norm_stderr\": 0.03352799844161865\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23623011015911874,\n \"mc1_stderr\": 0.014869755015871114,\n \"mc2\": 0.3799804264451723,\n \"mc2_stderr\": 0.014771856203795355\n }\n}\n```", "repo_url": "https://huggingface.co/NbAiLab/nb-gpt-j-6B-alpaca", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": 
[{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T15_55_19.313530", "path": ["**/details_harness|arc:challenge|25_2023-07-19T15:55:19.313530.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T15:55:19.313530.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T15_55_19.313530", "path": ["**/details_harness|hellaswag|10_2023-07-19T15:55:19.313530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T15:55:19.313530.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T15_55_19.313530", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:55:19.313530.parquet", 
"**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:55:19.313530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:55:19.313530.parquet", 
"**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:55:19.313530.parquet", 
"**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T15:55:19.313530.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:55:19.313530.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_19T15_55_19.313530", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:55:19.313530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:55:19.313530.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T15_55_19.313530", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:55:19.313530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:55:19.313530.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T15_55_19.313530", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:55:19.313530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:55:19.313530.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T15_55_19.313530", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:55:19.313530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:55:19.313530.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T15_55_19.313530", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:55:19.313530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:55:19.313530.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T15_55_19.313530", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:55:19.313530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:55:19.313530.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T15_55_19.313530", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:55:19.313530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:55:19.313530.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T15_55_19.313530", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:55:19.313530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:55:19.313530.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T15_55_19.313530", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:55:19.313530.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:55:19.313530.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T15_55_19.313530", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:55:19.313530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:55:19.313530.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T15_55_19.313530", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:55:19.313530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:55:19.313530.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T15_55_19.313530", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:55:19.313530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:55:19.313530.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T15_55_19.313530", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:55:19.313530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:55:19.313530.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T15_55_19.313530", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:55:19.313530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:55:19.313530.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T15_55_19.313530", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:55:19.313530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:55:19.313530.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T15_55_19.313530", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:55:19.313530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:55:19.313530.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T15_55_19.313530", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:55:19.313530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:55:19.313530.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T15_55_19.313530", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:55:19.313530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:55:19.313530.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T15_55_19.313530", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:55:19.313530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:55:19.313530.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T15_55_19.313530", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:55:19.313530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:55:19.313530.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T15_55_19.313530", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:55:19.313530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:55:19.313530.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T15_55_19.313530", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:55:19.313530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:55:19.313530.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T15_55_19.313530", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:55:19.313530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:55:19.313530.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T15_55_19.313530", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:55:19.313530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:55:19.313530.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T15_55_19.313530", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:55:19.313530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:55:19.313530.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T15_55_19.313530", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:55:19.313530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:55:19.313530.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T15_55_19.313530", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:55:19.313530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:55:19.313530.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T15_55_19.313530", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:55:19.313530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:55:19.313530.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T15_55_19.313530", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:55:19.313530.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:55:19.313530.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T15_55_19.313530", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:55:19.313530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:55:19.313530.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T15_55_19.313530", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:55:19.313530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:55:19.313530.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T15_55_19.313530", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:55:19.313530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:55:19.313530.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T15_55_19.313530", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:55:19.313530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:55:19.313530.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T15_55_19.313530", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:55:19.313530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:55:19.313530.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T15_55_19.313530", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:55:19.313530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:55:19.313530.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T15_55_19.313530", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:55:19.313530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:55:19.313530.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T15_55_19.313530", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:55:19.313530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:55:19.313530.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T15_55_19.313530", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:55:19.313530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:55:19.313530.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T15_55_19.313530", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T15:55:19.313530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T15:55:19.313530.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T15_55_19.313530", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:55:19.313530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:55:19.313530.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T15_55_19.313530", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:55:19.313530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:55:19.313530.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T15_55_19.313530", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:55:19.313530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:55:19.313530.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T15_55_19.313530", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:55:19.313530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:55:19.313530.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T15_55_19.313530", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:55:19.313530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:55:19.313530.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T15_55_19.313530", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:55:19.313530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:55:19.313530.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T15_55_19.313530", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:55:19.313530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:55:19.313530.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T15_55_19.313530", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:55:19.313530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:55:19.313530.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T15_55_19.313530", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:55:19.313530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:55:19.313530.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T15_55_19.313530", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:55:19.313530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:55:19.313530.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T15_55_19.313530", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:55:19.313530.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:55:19.313530.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T15_55_19.313530", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:55:19.313530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:55:19.313530.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_19T15_55_19.313530", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:55:19.313530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:55:19.313530.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T15_55_19.313530", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:55:19.313530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:55:19.313530.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T15_55_19.313530", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:55:19.313530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:55:19.313530.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T15_55_19.313530", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:55:19.313530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:55:19.313530.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T15_55_19.313530", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T15:55:19.313530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T15:55:19.313530.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T15_55_19.313530", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:55:19.313530.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:55:19.313530.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T15_55_19.313530", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T15:55:19.313530.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T15:55:19.313530.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T15_55_19.313530", "path": ["results_2023-07-19T15:55:19.313530.parquet"]}, {"split": "latest", "path": ["results_2023-07-19T15:55:19.313530.parquet"]}]}]}
|
2023-08-27T11:32:07+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of NbAiLab/nb-gpt-j-6B-alpaca
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model NbAiLab/nb-gpt-j-6B-alpaca on the Open LLM Leaderboard.
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
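For instance (this mirrors the example snippet stored in this record's metadata, which uses the `harness_truthfulqa_mc_0` configuration):
```python
from datasets import load_dataset

data = load_dataset("open-llm-leaderboard/details_NbAiLab__nb-gpt-j-6B-alpaca",
	"harness_truthfulqa_mc_0",
	split="train")
```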
## Latest results
These are the latest results from run 2023-07-19T15:55:19.313530 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of NbAiLab/nb-gpt-j-6B-alpaca",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model NbAiLab/nb-gpt-j-6B-alpaca on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-07-19T15:55:19.313530 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of NbAiLab/nb-gpt-j-6B-alpaca",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model NbAiLab/nb-gpt-j-6B-alpaca on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-07-19T15:55:19.313530 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
26,
31,
174,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of NbAiLab/nb-gpt-j-6B-alpaca## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model NbAiLab/nb-gpt-j-6B-alpaca on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-07-19T15:55:19.313530 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
3b07acf96f1ff0cf18a21b79f8fba9e15ba2f833
|
# Dataset Card for Evaluation run of mncai/chatdoctor
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/mncai/chatdoctor
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [mncai/chatdoctor](https://huggingface.co/mncai/chatdoctor) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_mncai__chatdoctor",
"harness_winogrande_5",
split="train")
```
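If you are not sure which configurations or splits exist, you can enumerate them first. A minimal sketch, assuming the `datasets` library is installed; the config name used below is taken from this card:

```python
from datasets import get_dataset_config_names, load_dataset

# List every configuration of this details repository
# (e.g. "harness_drop_3", "harness_gsm8k_5", "harness_winogrande_5", ...).
configs = get_dataset_config_names("open-llm-leaderboard/details_mncai__chatdoctor")
print(configs)

# Loading a config without a split returns a DatasetDict whose keys are
# the timestamped run splits plus "latest".
details = load_dataset("open-llm-leaderboard/details_mncai__chatdoctor",
                       "harness_winogrande_5")
print(list(details.keys()))
```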
## Latest results
These are the [latest results from run 2023-09-17T01:48:31.701330](https://huggingface.co/datasets/open-llm-leaderboard/details_mncai__chatdoctor/blob/main/results_2023-09-17T01-48-31.701330.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.22640520134228187,
"em_stderr": 0.004285876197711522,
"f1": 0.3016862416107395,
"f1_stderr": 0.004314877276433696,
"acc": 0.34964483030781374,
"acc_stderr": 0.006444005247352365
},
"harness|drop|3": {
"em": 0.22640520134228187,
"em_stderr": 0.004285876197711522,
"f1": 0.3016862416107395,
"f1_stderr": 0.004314877276433696
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.6992896606156275,
"acc_stderr": 0.01288801049470473
}
}
```
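The aggregated numbers above are also stored in the "results" configuration, whose "latest" split mirrors the most recent run. A minimal sketch of reading them back; the exact parquet schema is not documented on this card, so the printed fields are assumed to mirror the JSON above:

```python
from datasets import load_dataset

# "latest" points at the most recent run (here 2023-09-17T01:48:31.701330).
results = load_dataset("open-llm-leaderboard/details_mncai__chatdoctor",
                       "results",
                       split="latest")
print(results.features)  # schema assumed to follow the aggregated JSON above
print(results[0])
```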
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_mncai__chatdoctor
|
[
"region:us"
] |
2023-08-18T10:17:24+00:00
|
{"pretty_name": "Evaluation run of mncai/chatdoctor", "dataset_summary": "Dataset automatically created during the evaluation run of model [mncai/chatdoctor](https://huggingface.co/mncai/chatdoctor) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_mncai__chatdoctor\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-17T01:48:31.701330](https://huggingface.co/datasets/open-llm-leaderboard/details_mncai__chatdoctor/blob/main/results_2023-09-17T01-48-31.701330.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.22640520134228187,\n \"em_stderr\": 0.004285876197711522,\n \"f1\": 0.3016862416107395,\n \"f1_stderr\": 0.004314877276433696,\n \"acc\": 0.34964483030781374,\n \"acc_stderr\": 0.006444005247352365\n },\n \"harness|drop|3\": {\n \"em\": 0.22640520134228187,\n \"em_stderr\": 0.004285876197711522,\n \"f1\": 0.3016862416107395,\n \"f1_stderr\": 0.004314877276433696\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6992896606156275,\n \"acc_stderr\": 0.01288801049470473\n }\n}\n```", "repo_url": "https://huggingface.co/mncai/chatdoctor", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_24T15_52_02.947837", "path": ["**/details_harness|arc:challenge|25_2023-07-24T15:52:02.947837.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-24T15:52:02.947837.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_17T01_48_31.701330", "path": ["**/details_harness|drop|3_2023-09-17T01-48-31.701330.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-17T01-48-31.701330.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_17T01_48_31.701330", "path": ["**/details_harness|gsm8k|5_2023-09-17T01-48-31.701330.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-17T01-48-31.701330.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_24T15_52_02.947837", "path": ["**/details_harness|hellaswag|10_2023-07-24T15:52:02.947837.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-24T15:52:02.947837.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_24T15_52_02.947837", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T15:52:02.947837.parquet", 
"**/details_harness|hendrycksTest-management|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-24T15:52:02.947837.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T15:52:02.947837.parquet", 
"**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T15:52:02.947837.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-24T15:52:02.947837.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_24T15_52_02.947837", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T15:52:02.947837.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T15:52:02.947837.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_24T15_52_02.947837", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T15:52:02.947837.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T15:52:02.947837.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_24T15_52_02.947837", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T15:52:02.947837.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T15:52:02.947837.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_24T15_52_02.947837", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T15:52:02.947837.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T15:52:02.947837.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_24T15_52_02.947837", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T15:52:02.947837.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T15:52:02.947837.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_24T15_52_02.947837", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T15:52:02.947837.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T15:52:02.947837.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_24T15_52_02.947837", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T15:52:02.947837.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T15:52:02.947837.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_24T15_52_02.947837", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T15:52:02.947837.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T15:52:02.947837.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_24T15_52_02.947837", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T15:52:02.947837.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T15:52:02.947837.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_24T15_52_02.947837", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T15:52:02.947837.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T15:52:02.947837.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_24T15_52_02.947837", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T15:52:02.947837.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T15:52:02.947837.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_24T15_52_02.947837", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T15:52:02.947837.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T15:52:02.947837.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_24T15_52_02.947837", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T15:52:02.947837.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T15:52:02.947837.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_24T15_52_02.947837", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T15:52:02.947837.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T15:52:02.947837.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_24T15_52_02.947837", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T15:52:02.947837.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T15:52:02.947837.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_24T15_52_02.947837", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T15:52:02.947837.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T15:52:02.947837.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_24T15_52_02.947837", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T15:52:02.947837.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T15:52:02.947837.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_24T15_52_02.947837", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T15:52:02.947837.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T15:52:02.947837.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_24T15_52_02.947837", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T15:52:02.947837.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T15:52:02.947837.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_24T15_52_02.947837", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T15:52:02.947837.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T15:52:02.947837.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_24T15_52_02.947837", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T15:52:02.947837.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T15:52:02.947837.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_24T15_52_02.947837", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T15:52:02.947837.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T15:52:02.947837.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_24T15_52_02.947837", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T15:52:02.947837.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T15:52:02.947837.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_24T15_52_02.947837", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T15:52:02.947837.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T15:52:02.947837.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_24T15_52_02.947837", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T15:52:02.947837.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T15:52:02.947837.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_24T15_52_02.947837", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T15:52:02.947837.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T15:52:02.947837.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_24T15_52_02.947837", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T15:52:02.947837.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T15:52:02.947837.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_24T15_52_02.947837", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T15:52:02.947837.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T15:52:02.947837.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_24T15_52_02.947837", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T15:52:02.947837.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T15:52:02.947837.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_24T15_52_02.947837", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T15:52:02.947837.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T15:52:02.947837.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_24T15_52_02.947837", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T15:52:02.947837.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T15:52:02.947837.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_24T15_52_02.947837", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T15:52:02.947837.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T15:52:02.947837.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_24T15_52_02.947837", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T15:52:02.947837.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T15:52:02.947837.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_24T15_52_02.947837", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T15:52:02.947837.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T15:52:02.947837.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_24T15_52_02.947837", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T15:52:02.947837.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T15:52:02.947837.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_24T15_52_02.947837", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T15:52:02.947837.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T15:52:02.947837.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_24T15_52_02.947837", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T15:52:02.947837.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T15:52:02.947837.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_24T15_52_02.947837", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T15:52:02.947837.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T15:52:02.947837.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_24T15_52_02.947837", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T15:52:02.947837.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T15:52:02.947837.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_24T15_52_02.947837", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T15:52:02.947837.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T15:52:02.947837.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_24T15_52_02.947837", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T15:52:02.947837.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T15:52:02.947837.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": 
"2023_07_24T15_52_02.947837", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T15:52:02.947837.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T15:52:02.947837.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_24T15_52_02.947837", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T15:52:02.947837.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T15:52:02.947837.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_24T15_52_02.947837", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T15:52:02.947837.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T15:52:02.947837.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_24T15_52_02.947837", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T15:52:02.947837.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T15:52:02.947837.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_24T15_52_02.947837", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T15:52:02.947837.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T15:52:02.947837.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_24T15_52_02.947837", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T15:52:02.947837.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T15:52:02.947837.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_24T15_52_02.947837", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T15:52:02.947837.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T15:52:02.947837.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_24T15_52_02.947837", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T15:52:02.947837.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T15:52:02.947837.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_24T15_52_02.947837", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T15:52:02.947837.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T15:52:02.947837.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_24T15_52_02.947837", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T15:52:02.947837.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T15:52:02.947837.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_24T15_52_02.947837", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T15:52:02.947837.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T15:52:02.947837.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_24T15_52_02.947837", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T15:52:02.947837.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T15:52:02.947837.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_24T15_52_02.947837", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T15:52:02.947837.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T15:52:02.947837.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_24T15_52_02.947837", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T15:52:02.947837.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T15:52:02.947837.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_24T15_52_02.947837", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T15:52:02.947837.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T15:52:02.947837.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_24T15_52_02.947837", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T15:52:02.947837.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T15:52:02.947837.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_24T15_52_02.947837", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T15:52:02.947837.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T15:52:02.947837.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_17T01_48_31.701330", "path": ["**/details_harness|winogrande|5_2023-09-17T01-48-31.701330.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-17T01-48-31.701330.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_24T15_52_02.947837", "path": ["results_2023-07-24T15:52:02.947837.parquet"]}, {"split": "2023_09_17T01_48_31.701330", "path": ["results_2023-09-17T01-48-31.701330.parquet"]}, {"split": "latest", "path": ["results_2023-09-17T01-48-31.701330.parquet"]}]}]}
|
2023-09-17T00:48:43+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of mncai/chatdoctor
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model mncai/chatdoctor on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-09-17T01:48:31.701330 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of mncai/chatdoctor",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model mncai/chatdoctor on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-17T01:48:31.701330(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of mncai/chatdoctor",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model mncai/chatdoctor on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-17T01:48:31.701330(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
16,
31,
164,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of mncai/chatdoctor## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model mncai/chatdoctor on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-17T01:48:31.701330(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
413282cccd2472416327ff987ef13043409dfb97
|
# Dataset Card for Evaluation run of bhenrym14/airoboros-33b-gpt4-1.4.1-lxctx-PI-16384-fp16
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/bhenrym14/airoboros-33b-gpt4-1.4.1-lxctx-PI-16384-fp16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [bhenrym14/airoboros-33b-gpt4-1.4.1-lxctx-PI-16384-fp16](https://huggingface.co/bhenrym14/airoboros-33b-gpt4-1.4.1-lxctx-PI-16384-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_bhenrym14__airoboros-33b-gpt4-1.4.1-lxctx-PI-16384-fp16",
"harness_winogrande_5",
split="train")
```
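Because this repository contains several runs, each configuration also exposes timestamp-named splits alongside "latest". A minimal sketch of pinning a specific run rather than the latest one; the split name below is copied from the "harness_gsm8k_5" configuration listed in this card's metadata:

```python
from datasets import load_dataset

# Pin the evaluation details to one specific run instead of "latest".
data = load_dataset(
    "open-llm-leaderboard/details_bhenrym14__airoboros-33b-gpt4-1.4.1-lxctx-PI-16384-fp16",
    "harness_gsm8k_5",
    split="2023_09_23T11_51_04.890467",
)
print(len(data))
```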
## Latest results
These are the [latest results from run 2023-09-23T11:51:04.890467](https://huggingface.co/datasets/open-llm-leaderboard/details_bhenrym14__airoboros-33b-gpt4-1.4.1-lxctx-PI-16384-fp16/blob/main/results_2023-09-23T11-51-04.890467.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.05044043624161074,
"em_stderr": 0.002241249338683634,
"f1": 0.1215436241610737,
"f1_stderr": 0.002582740471791708,
"acc": 0.4114226953164035,
"acc_stderr": 0.01004007069157239
},
"harness|drop|3": {
"em": 0.05044043624161074,
"em_stderr": 0.002241249338683634,
"f1": 0.1215436241610737,
"f1_stderr": 0.002582740471791708
},
"harness|gsm8k|5": {
"acc": 0.08567096285064443,
"acc_stderr": 0.007709218855882771
},
"harness|winogrande|5": {
"acc": 0.7371744277821626,
"acc_stderr": 0.012370922527262008
}
}
```
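Since the "results" configuration keeps one split per run, the runs of this model can be compared side by side. A minimal sketch, assuming each split holds rows of aggregated metrics shaped like the JSON above:

```python
from datasets import load_dataset

# Load every results split (the timestamped runs plus "latest") at once.
results = load_dataset(
    "open-llm-leaderboard/details_bhenrym14__airoboros-33b-gpt4-1.4.1-lxctx-PI-16384-fp16",
    "results",
)
for split_name, split in results.items():
    print(split_name, split[0])  # row schema assumed to mirror the JSON above
```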
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_bhenrym14__airoboros-33b-gpt4-1.4.1-lxctx-PI-16384-fp16
|
[
"region:us"
] |
2023-08-18T10:17:33+00:00
|
{"pretty_name": "Evaluation run of bhenrym14/airoboros-33b-gpt4-1.4.1-lxctx-PI-16384-fp16", "dataset_summary": "Dataset automatically created during the evaluation run of model [bhenrym14/airoboros-33b-gpt4-1.4.1-lxctx-PI-16384-fp16](https://huggingface.co/bhenrym14/airoboros-33b-gpt4-1.4.1-lxctx-PI-16384-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_bhenrym14__airoboros-33b-gpt4-1.4.1-lxctx-PI-16384-fp16\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-23T11:51:04.890467](https://huggingface.co/datasets/open-llm-leaderboard/details_bhenrym14__airoboros-33b-gpt4-1.4.1-lxctx-PI-16384-fp16/blob/main/results_2023-09-23T11-51-04.890467.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.05044043624161074,\n \"em_stderr\": 0.002241249338683634,\n \"f1\": 0.1215436241610737,\n \"f1_stderr\": 0.002582740471791708,\n \"acc\": 0.4114226953164035,\n \"acc_stderr\": 0.01004007069157239\n },\n \"harness|drop|3\": {\n \"em\": 0.05044043624161074,\n \"em_stderr\": 0.002241249338683634,\n \"f1\": 0.1215436241610737,\n \"f1_stderr\": 0.002582740471791708\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.08567096285064443,\n \"acc_stderr\": 0.007709218855882771\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7371744277821626,\n \"acc_stderr\": 0.012370922527262008\n }\n}\n```", "repo_url": "https://huggingface.co/bhenrym14/airoboros-33b-gpt4-1.4.1-lxctx-PI-16384-fp16", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_09T13_44_06.910726", "path": ["**/details_harness|arc:challenge|25_2023-08-09T13:44:06.910726.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-09T13:44:06.910726.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_23T02_06_28.152881", "path": ["**/details_harness|drop|3_2023-09-23T02-06-28.152881.parquet"]}, {"split": "2023_09_23T11_51_04.890467", "path": ["**/details_harness|drop|3_2023-09-23T11-51-04.890467.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-23T11-51-04.890467.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_23T02_06_28.152881", "path": ["**/details_harness|gsm8k|5_2023-09-23T02-06-28.152881.parquet"]}, {"split": "2023_09_23T11_51_04.890467", "path": ["**/details_harness|gsm8k|5_2023-09-23T11-51-04.890467.parquet"]}, {"split": "latest", "path": 
["**/details_harness|gsm8k|5_2023-09-23T11-51-04.890467.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_09T13_44_06.910726", "path": ["**/details_harness|hellaswag|10_2023-08-09T13:44:06.910726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-09T13:44:06.910726.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_09T13_44_06.910726", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T13:44:06.910726.parquet", 
"**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T13:44:06.910726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T13:44:06.910726.parquet", 
"**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T13:44:06.910726.parquet", 
"**/details_harness|hendrycksTest-sociology|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T13:44:06.910726.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T13:44:06.910726.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_09T13_44_06.910726", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T13:44:06.910726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T13:44:06.910726.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_09T13_44_06.910726", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T13:44:06.910726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T13:44:06.910726.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_09T13_44_06.910726", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T13:44:06.910726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T13:44:06.910726.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_09T13_44_06.910726", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T13:44:06.910726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T13:44:06.910726.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_09T13_44_06.910726", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T13:44:06.910726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T13:44:06.910726.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_09T13_44_06.910726", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T13:44:06.910726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T13:44:06.910726.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_09T13_44_06.910726", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T13:44:06.910726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T13:44:06.910726.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_09T13_44_06.910726", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T13:44:06.910726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T13:44:06.910726.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_09T13_44_06.910726", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T13:44:06.910726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T13:44:06.910726.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_09T13_44_06.910726", "path": 
["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T13:44:06.910726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T13:44:06.910726.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_09T13_44_06.910726", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T13:44:06.910726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T13:44:06.910726.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_09T13_44_06.910726", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T13:44:06.910726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T13:44:06.910726.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_09T13_44_06.910726", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T13:44:06.910726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T13:44:06.910726.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_09T13_44_06.910726", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T13:44:06.910726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T13:44:06.910726.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_09T13_44_06.910726", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T13:44:06.910726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T13:44:06.910726.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_09T13_44_06.910726", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T13:44:06.910726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T13:44:06.910726.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_09T13_44_06.910726", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T13:44:06.910726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T13:44:06.910726.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_09T13_44_06.910726", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T13:44:06.910726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T13:44:06.910726.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_09T13_44_06.910726", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T13:44:06.910726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T13:44:06.910726.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_09T13_44_06.910726", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T13:44:06.910726.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T13:44:06.910726.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_09T13_44_06.910726", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T13:44:06.910726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T13:44:06.910726.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_09T13_44_06.910726", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T13:44:06.910726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T13:44:06.910726.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_09T13_44_06.910726", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T13:44:06.910726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T13:44:06.910726.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_09T13_44_06.910726", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T13:44:06.910726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T13:44:06.910726.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_09T13_44_06.910726", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T13:44:06.910726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T13:44:06.910726.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_09T13_44_06.910726", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T13:44:06.910726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T13:44:06.910726.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_09T13_44_06.910726", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T13:44:06.910726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T13:44:06.910726.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_09T13_44_06.910726", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T13:44:06.910726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T13:44:06.910726.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_09T13_44_06.910726", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T13:44:06.910726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T13:44:06.910726.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_09T13_44_06.910726", "path": 
["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T13:44:06.910726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T13:44:06.910726.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_09T13_44_06.910726", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T13:44:06.910726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T13:44:06.910726.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_09T13_44_06.910726", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T13:44:06.910726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T13:44:06.910726.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_09T13_44_06.910726", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T13:44:06.910726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T13:44:06.910726.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_09T13_44_06.910726", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T13:44:06.910726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T13:44:06.910726.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_09T13_44_06.910726", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T13:44:06.910726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T13:44:06.910726.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_09T13_44_06.910726", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T13:44:06.910726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T13:44:06.910726.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_09T13_44_06.910726", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T13:44:06.910726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T13:44:06.910726.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_09T13_44_06.910726", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T13:44:06.910726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T13:44:06.910726.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_09T13_44_06.910726", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T13:44:06.910726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T13:44:06.910726.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_09T13_44_06.910726", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T13:44:06.910726.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-marketing|5_2023-08-09T13:44:06.910726.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_09T13_44_06.910726", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T13:44:06.910726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T13:44:06.910726.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_09T13_44_06.910726", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T13:44:06.910726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T13:44:06.910726.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_09T13_44_06.910726", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T13:44:06.910726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T13:44:06.910726.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_09T13_44_06.910726", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T13:44:06.910726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T13:44:06.910726.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_09T13_44_06.910726", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T13:44:06.910726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T13:44:06.910726.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_09T13_44_06.910726", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T13:44:06.910726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T13:44:06.910726.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_09T13_44_06.910726", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T13:44:06.910726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T13:44:06.910726.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_09T13_44_06.910726", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T13:44:06.910726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T13:44:06.910726.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_09T13_44_06.910726", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T13:44:06.910726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T13:44:06.910726.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_09T13_44_06.910726", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T13:44:06.910726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T13:44:06.910726.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": 
"2023_08_09T13_44_06.910726", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T13:44:06.910726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T13:44:06.910726.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_09T13_44_06.910726", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T13:44:06.910726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T13:44:06.910726.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_09T13_44_06.910726", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T13:44:06.910726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T13:44:06.910726.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_09T13_44_06.910726", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T13:44:06.910726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T13:44:06.910726.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_09T13_44_06.910726", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T13:44:06.910726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T13:44:06.910726.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_09T13_44_06.910726", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T13:44:06.910726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T13:44:06.910726.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_09T13_44_06.910726", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T13:44:06.910726.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T13:44:06.910726.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_09T13_44_06.910726", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T13:44:06.910726.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T13:44:06.910726.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_23T02_06_28.152881", "path": ["**/details_harness|winogrande|5_2023-09-23T02-06-28.152881.parquet"]}, {"split": "2023_09_23T11_51_04.890467", "path": ["**/details_harness|winogrande|5_2023-09-23T11-51-04.890467.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-23T11-51-04.890467.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_09T13_44_06.910726", "path": ["results_2023-08-09T13:44:06.910726.parquet"]}, {"split": "2023_09_23T02_06_28.152881", "path": ["results_2023-09-23T02-06-28.152881.parquet"]}, {"split": "2023_09_23T11_51_04.890467", "path": ["results_2023-09-23T11-51-04.890467.parquet"]}, {"split": "latest", "path": ["results_2023-09-23T11-51-04.890467.parquet"]}]}]}
|
2023-09-23T10:51:16+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of bhenrym14/airoboros-33b-gpt4-1.4.1-lxctx-PI-16384-fp16
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model bhenrym14/airoboros-33b-gpt4-1.4.1-lxctx-PI-16384-fp16 on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
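A minimal sketch of that call, assuming the `details_<org>__<model>` repo-naming convention used by the other cards in this document (the repo id below is inferred from the model name; `harness_winogrande_5` is one of the configurations listed in this card's metadata):

```python
from datasets import load_dataset

# Repo id inferred from the model name via the details_<org>__<model> convention.
data = load_dataset(
    "open-llm-leaderboard/details_bhenrym14__airoboros-33b-gpt4-1.4.1-lxctx-PI-16384-fp16",
    "harness_winogrande_5",
    split="train",
)
```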
## Latest results
These are the latest results from run 2023-09-23T11:51:04.890467 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
03438b602157c17ffb7d6897569fe7cd08d2a243
|
# Dataset Card for Evaluation run of bhenrym14/airophin-13b-pntk-16k-fp16
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/bhenrym14/airophin-13b-pntk-16k-fp16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [bhenrym14/airophin-13b-pntk-16k-fp16](https://huggingface.co/bhenrym14/airophin-13b-pntk-16k-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_bhenrym14__airophin-13b-pntk-16k-fp16",
"harness_winogrande_5",
split="train")
```
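Individual runs can also be selected by their timestamped split names, and the aggregated metrics live in the `results` configuration. A short sketch, using the split name `2023_09_22T13_50_50.012213` listed in this card's configuration metadata:

```python
from datasets import load_dataset

# Details for one specific timestamped run of the drop task.
drop_details = load_dataset(
    "open-llm-leaderboard/details_bhenrym14__airophin-13b-pntk-16k-fp16",
    "harness_drop_3",
    split="2023_09_22T13_50_50.012213",
)

# Aggregated metrics; the "latest" split always points to the most recent run.
results = load_dataset(
    "open-llm-leaderboard/details_bhenrym14__airophin-13b-pntk-16k-fp16",
    "results",
    split="latest",
)
```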
## Latest results
These are the [latest results from run 2023-09-22T13:50:50.012213](https://huggingface.co/datasets/open-llm-leaderboard/details_bhenrym14__airophin-13b-pntk-16k-fp16/blob/main/results_2023-09-22T13-50-50.012213.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.12793624161073824,
"em_stderr": 0.003420665162123249,
"f1": 0.20498217281879216,
"f1_stderr": 0.0035591754960457413,
"acc": 0.421002792649235,
"acc_stderr": 0.009731603620470694
},
"harness|drop|3": {
"em": 0.12793624161073824,
"em_stderr": 0.003420665162123249,
"f1": 0.20498217281879216,
"f1_stderr": 0.0035591754960457413
},
"harness|gsm8k|5": {
"acc": 0.0803639120545868,
"acc_stderr": 0.007488258573239077
},
"harness|winogrande|5": {
"acc": 0.7616416732438832,
"acc_stderr": 0.01197494866770231
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_bhenrym14__airophin-13b-pntk-16k-fp16
|
[
"region:us"
] |
2023-08-18T10:17:42+00:00
|
{"pretty_name": "Evaluation run of bhenrym14/airophin-13b-pntk-16k-fp16", "dataset_summary": "Dataset automatically created during the evaluation run of model [bhenrym14/airophin-13b-pntk-16k-fp16](https://huggingface.co/bhenrym14/airophin-13b-pntk-16k-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_bhenrym14__airophin-13b-pntk-16k-fp16\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-22T13:50:50.012213](https://huggingface.co/datasets/open-llm-leaderboard/details_bhenrym14__airophin-13b-pntk-16k-fp16/blob/main/results_2023-09-22T13-50-50.012213.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.12793624161073824,\n \"em_stderr\": 0.003420665162123249,\n \"f1\": 0.20498217281879216,\n \"f1_stderr\": 0.0035591754960457413,\n \"acc\": 0.421002792649235,\n \"acc_stderr\": 0.009731603620470694\n },\n \"harness|drop|3\": {\n \"em\": 0.12793624161073824,\n \"em_stderr\": 0.003420665162123249,\n \"f1\": 0.20498217281879216,\n \"f1_stderr\": 0.0035591754960457413\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0803639120545868,\n \"acc_stderr\": 0.007488258573239077\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7616416732438832,\n \"acc_stderr\": 0.01197494866770231\n }\n}\n```", "repo_url": "https://huggingface.co/bhenrym14/airophin-13b-pntk-16k-fp16", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_09T13_13_26.207427", "path": ["**/details_harness|arc:challenge|25_2023-08-09T13:13:26.207427.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-09T13:13:26.207427.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_22T13_50_50.012213", "path": ["**/details_harness|drop|3_2023-09-22T13-50-50.012213.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-22T13-50-50.012213.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_22T13_50_50.012213", "path": ["**/details_harness|gsm8k|5_2023-09-22T13-50-50.012213.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-22T13-50-50.012213.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_09T13_13_26.207427", "path": ["**/details_harness|hellaswag|10_2023-08-09T13:13:26.207427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-09T13:13:26.207427.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_09T13_13_26.207427", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T13:13:26.207427.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T13:13:26.207427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T13:13:26.207427.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T13:13:26.207427.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T13:13:26.207427.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_08_09T13_13_26.207427", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T13:13:26.207427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T13:13:26.207427.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_09T13_13_26.207427", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T13:13:26.207427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T13:13:26.207427.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_09T13_13_26.207427", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T13:13:26.207427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T13:13:26.207427.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_09T13_13_26.207427", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T13:13:26.207427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T13:13:26.207427.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_09T13_13_26.207427", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T13:13:26.207427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T13:13:26.207427.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_09T13_13_26.207427", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T13:13:26.207427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T13:13:26.207427.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_09T13_13_26.207427", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T13:13:26.207427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T13:13:26.207427.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_09T13_13_26.207427", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T13:13:26.207427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T13:13:26.207427.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_09T13_13_26.207427", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T13:13:26.207427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T13:13:26.207427.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_09T13_13_26.207427", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T13:13:26.207427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T13:13:26.207427.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_09T13_13_26.207427", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T13:13:26.207427.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T13:13:26.207427.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_09T13_13_26.207427", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T13:13:26.207427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T13:13:26.207427.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_09T13_13_26.207427", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T13:13:26.207427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T13:13:26.207427.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_09T13_13_26.207427", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T13:13:26.207427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T13:13:26.207427.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_09T13_13_26.207427", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T13:13:26.207427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T13:13:26.207427.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_09T13_13_26.207427", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T13:13:26.207427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T13:13:26.207427.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_09T13_13_26.207427", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T13:13:26.207427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T13:13:26.207427.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_09T13_13_26.207427", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T13:13:26.207427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T13:13:26.207427.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_09T13_13_26.207427", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T13:13:26.207427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T13:13:26.207427.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_09T13_13_26.207427", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T13:13:26.207427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T13:13:26.207427.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_09T13_13_26.207427", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T13:13:26.207427.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T13:13:26.207427.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_09T13_13_26.207427", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T13:13:26.207427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T13:13:26.207427.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_09T13_13_26.207427", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T13:13:26.207427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T13:13:26.207427.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_09T13_13_26.207427", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T13:13:26.207427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T13:13:26.207427.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_09T13_13_26.207427", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T13:13:26.207427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T13:13:26.207427.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_09T13_13_26.207427", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T13:13:26.207427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T13:13:26.207427.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_09T13_13_26.207427", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T13:13:26.207427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T13:13:26.207427.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_09T13_13_26.207427", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T13:13:26.207427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T13:13:26.207427.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_09T13_13_26.207427", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T13:13:26.207427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T13:13:26.207427.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_09T13_13_26.207427", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T13:13:26.207427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T13:13:26.207427.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_09T13_13_26.207427", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T13:13:26.207427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T13:13:26.207427.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_09T13_13_26.207427", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T13:13:26.207427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T13:13:26.207427.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_09T13_13_26.207427", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T13:13:26.207427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T13:13:26.207427.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_09T13_13_26.207427", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T13:13:26.207427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T13:13:26.207427.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_09T13_13_26.207427", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T13:13:26.207427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T13:13:26.207427.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_09T13_13_26.207427", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T13:13:26.207427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T13:13:26.207427.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_09T13_13_26.207427", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T13:13:26.207427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T13:13:26.207427.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_09T13_13_26.207427", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T13:13:26.207427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T13:13:26.207427.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_09T13_13_26.207427", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T13:13:26.207427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T13:13:26.207427.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_09T13_13_26.207427", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T13:13:26.207427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T13:13:26.207427.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_09T13_13_26.207427", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T13:13:26.207427.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T13:13:26.207427.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_09T13_13_26.207427", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T13:13:26.207427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T13:13:26.207427.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_09T13_13_26.207427", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T13:13:26.207427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T13:13:26.207427.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_09T13_13_26.207427", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T13:13:26.207427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T13:13:26.207427.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_09T13_13_26.207427", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T13:13:26.207427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T13:13:26.207427.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_09T13_13_26.207427", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T13:13:26.207427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T13:13:26.207427.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_09T13_13_26.207427", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T13:13:26.207427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T13:13:26.207427.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_09T13_13_26.207427", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T13:13:26.207427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T13:13:26.207427.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_09T13_13_26.207427", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T13:13:26.207427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T13:13:26.207427.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_09T13_13_26.207427", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T13:13:26.207427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T13:13:26.207427.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_09T13_13_26.207427", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T13:13:26.207427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T13:13:26.207427.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_08_09T13_13_26.207427", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T13:13:26.207427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T13:13:26.207427.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_09T13_13_26.207427", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T13:13:26.207427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T13:13:26.207427.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_09T13_13_26.207427", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T13:13:26.207427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T13:13:26.207427.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_09T13_13_26.207427", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T13:13:26.207427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T13:13:26.207427.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_09T13_13_26.207427", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T13:13:26.207427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T13:13:26.207427.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_09T13_13_26.207427", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T13:13:26.207427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T13:13:26.207427.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_09T13_13_26.207427", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T13:13:26.207427.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T13:13:26.207427.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_22T13_50_50.012213", "path": ["**/details_harness|winogrande|5_2023-09-22T13-50-50.012213.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-22T13-50-50.012213.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_09T13_13_26.207427", "path": ["results_2023-08-09T13:13:26.207427.parquet"]}, {"split": "2023_09_22T13_50_50.012213", "path": ["results_2023-09-22T13-50-50.012213.parquet"]}, {"split": "latest", "path": ["results_2023-09-22T13-50-50.012213.parquet"]}]}]}
|
2023-09-22T12:51:03+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of bhenrym14/airophin-13b-pntk-16k-fp16
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model bhenrym14/airophin-13b-pntk-16k-fp16 on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
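A minimal sketch follows. The repository id below is inferred from the naming convention used elsewhere in this collection (details_<org>__<model>) and is an assumption, not confirmed by this card:

```python
# Sketch only: the dataset id is inferred from the
# open-llm-leaderboard naming convention and is an assumption.
from datasets import load_dataset

data = load_dataset(
    "open-llm-leaderboard/details_bhenrym14__airophin-13b-pntk-16k-fp16",
    "harness_winogrande_5",
    split="train",
)
```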
## Latest results
These are the latest results from run 2023-09-22T13:50:50.012213 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of bhenrym14/airophin-13b-pntk-16k-fp16",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model bhenrym14/airophin-13b-pntk-16k-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-22T13:50:50.012213(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of bhenrym14/airophin-13b-pntk-16k-fp16",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model bhenrym14/airophin-13b-pntk-16k-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-22T13:50:50.012213(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
30,
31,
178,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of bhenrym14/airophin-13b-pntk-16k-fp16## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model bhenrym14/airophin-13b-pntk-16k-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-22T13:50:50.012213(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
984afd67c64180972a5ad61566f96f7f0750a626
|
# Dataset Card for Evaluation run of ausboss/llama-30b-supercot
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/ausboss/llama-30b-supercot
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [ausboss/llama-30b-supercot](https://huggingface.co/ausboss/llama-30b-supercot) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ausboss__llama-30b-supercot",
"harness_winogrande_5",
split="train")
```
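Each per-task config in this card's metadata also exposes a "latest" split, so the most recent run of a single task can be loaded directly instead of the "train" split. A minimal sketch using the "harness_gsm8k_5" config listed in the metadata below:

```python
# Sketch: "harness_gsm8k_5" and its "latest" split are listed in this
# card's config metadata; any other config name works the same way.
from datasets import load_dataset

gsm8k_latest = load_dataset(
    "open-llm-leaderboard/details_ausboss__llama-30b-supercot",
    "harness_gsm8k_5",
    split="latest",
)
```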
## Latest results
These are the [latest results from run 2023-10-16T01:12:25.087238](https://huggingface.co/datasets/open-llm-leaderboard/details_ausboss__llama-30b-supercot/blob/main/results_2023-10-16T01-12-25.087238.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.11619127516778524,
"em_stderr": 0.0032817504432788346,
"f1": 0.19070469798657516,
"f1_stderr": 0.003401025373096678,
"acc": 0.45967263712374484,
"acc_stderr": 0.010077515646893735
},
"harness|drop|3": {
"em": 0.11619127516778524,
"em_stderr": 0.0032817504432788346,
"f1": 0.19070469798657516,
"f1_stderr": 0.003401025373096678
},
"harness|gsm8k|5": {
"acc": 0.11902956785443518,
"acc_stderr": 0.008919702911161629
},
"harness|winogrande|5": {
"acc": 0.8003157063930545,
"acc_stderr": 0.01123532838262584
}
}
```
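To retrieve these aggregated numbers programmatically rather than copying them from the JSON above, the "results" config (listed in this card's metadata with a "latest" split) can be loaded. A minimal sketch:

```python
# Sketch: the "results" config aggregates all runs; its "latest" split
# corresponds to the run shown above (2023-10-16T01:12:25.087238).
from datasets import load_dataset

results = load_dataset(
    "open-llm-leaderboard/details_ausboss__llama-30b-supercot",
    "results",
    split="latest",
)
```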
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_ausboss__llama-30b-supercot
|
[
"region:us"
] |
2023-08-18T10:17:51+00:00
|
{"pretty_name": "Evaluation run of ausboss/llama-30b-supercot", "dataset_summary": "Dataset automatically created during the evaluation run of model [ausboss/llama-30b-supercot](https://huggingface.co/ausboss/llama-30b-supercot) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ausboss__llama-30b-supercot\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-16T01:12:25.087238](https://huggingface.co/datasets/open-llm-leaderboard/details_ausboss__llama-30b-supercot/blob/main/results_2023-10-16T01-12-25.087238.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.11619127516778524,\n \"em_stderr\": 0.0032817504432788346,\n \"f1\": 0.19070469798657516,\n \"f1_stderr\": 0.003401025373096678,\n \"acc\": 0.45967263712374484,\n \"acc_stderr\": 0.010077515646893735\n },\n \"harness|drop|3\": {\n \"em\": 0.11619127516778524,\n \"em_stderr\": 0.0032817504432788346,\n \"f1\": 0.19070469798657516,\n \"f1_stderr\": 0.003401025373096678\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.11902956785443518,\n \"acc_stderr\": 0.008919702911161629\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8003157063930545,\n \"acc_stderr\": 0.01123532838262584\n }\n}\n```", "repo_url": "https://huggingface.co/ausboss/llama-30b-supercot", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T22_24_52.456650", "path": ["**/details_harness|arc:challenge|25_2023-07-19T22:24:52.456650.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T22:24:52.456650.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_16T01_12_25.087238", "path": ["**/details_harness|drop|3_2023-10-16T01-12-25.087238.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-16T01-12-25.087238.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_16T01_12_25.087238", "path": ["**/details_harness|gsm8k|5_2023-10-16T01-12-25.087238.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-16T01-12-25.087238.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T22_24_52.456650", "path": ["**/details_harness|hellaswag|10_2023-07-19T22:24:52.456650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T22:24:52.456650.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": 
[{"split": "2023_07_19T22_24_52.456650", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:24:52.456650.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:24:52.456650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:24:52.456650.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T22:24:52.456650.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:24:52.456650.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T22_24_52.456650", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:24:52.456650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:24:52.456650.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T22_24_52.456650", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:24:52.456650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:24:52.456650.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T22_24_52.456650", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:24:52.456650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:24:52.456650.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T22_24_52.456650", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:24:52.456650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:24:52.456650.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T22_24_52.456650", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:24:52.456650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:24:52.456650.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T22_24_52.456650", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:24:52.456650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:24:52.456650.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T22_24_52.456650", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:24:52.456650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:24:52.456650.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T22_24_52.456650", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:24:52.456650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:24:52.456650.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T22_24_52.456650", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:24:52.456650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:24:52.456650.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T22_24_52.456650", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:24:52.456650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:24:52.456650.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T22_24_52.456650", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:24:52.456650.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:24:52.456650.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T22_24_52.456650", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:24:52.456650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:24:52.456650.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T22_24_52.456650", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:24:52.456650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:24:52.456650.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T22_24_52.456650", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:24:52.456650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:24:52.456650.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T22_24_52.456650", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:24:52.456650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:24:52.456650.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T22_24_52.456650", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:24:52.456650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:24:52.456650.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T22_24_52.456650", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:24:52.456650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:24:52.456650.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T22_24_52.456650", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:24:52.456650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:24:52.456650.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T22_24_52.456650", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:24:52.456650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:24:52.456650.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T22_24_52.456650", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:24:52.456650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:24:52.456650.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T22_24_52.456650", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:24:52.456650.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:24:52.456650.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T22_24_52.456650", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:24:52.456650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:24:52.456650.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T22_24_52.456650", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:24:52.456650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:24:52.456650.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T22_24_52.456650", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:24:52.456650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:24:52.456650.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T22_24_52.456650", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:24:52.456650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:24:52.456650.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T22_24_52.456650", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:24:52.456650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:24:52.456650.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T22_24_52.456650", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:24:52.456650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:24:52.456650.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T22_24_52.456650", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:24:52.456650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:24:52.456650.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T22_24_52.456650", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:24:52.456650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:24:52.456650.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T22_24_52.456650", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:24:52.456650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:24:52.456650.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T22_24_52.456650", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:24:52.456650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:24:52.456650.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T22_24_52.456650", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:24:52.456650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:24:52.456650.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T22_24_52.456650", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:24:52.456650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:24:52.456650.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T22_24_52.456650", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:24:52.456650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:24:52.456650.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T22_24_52.456650", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:24:52.456650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:24:52.456650.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T22_24_52.456650", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:24:52.456650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:24:52.456650.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T22_24_52.456650", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:24:52.456650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:24:52.456650.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T22_24_52.456650", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:24:52.456650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:24:52.456650.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T22_24_52.456650", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T22:24:52.456650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T22:24:52.456650.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T22_24_52.456650", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:24:52.456650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:24:52.456650.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T22_24_52.456650", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:24:52.456650.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:24:52.456650.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T22_24_52.456650", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:24:52.456650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:24:52.456650.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T22_24_52.456650", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:24:52.456650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:24:52.456650.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T22_24_52.456650", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:24:52.456650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:24:52.456650.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T22_24_52.456650", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:24:52.456650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:24:52.456650.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T22_24_52.456650", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:24:52.456650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:24:52.456650.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T22_24_52.456650", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:24:52.456650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:24:52.456650.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T22_24_52.456650", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:24:52.456650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:24:52.456650.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T22_24_52.456650", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:24:52.456650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:24:52.456650.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T22_24_52.456650", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:24:52.456650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:24:52.456650.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T22_24_52.456650", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:24:52.456650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:24:52.456650.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T22_24_52.456650", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:24:52.456650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:24:52.456650.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T22_24_52.456650", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:24:52.456650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:24:52.456650.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T22_24_52.456650", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:24:52.456650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:24:52.456650.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T22_24_52.456650", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:24:52.456650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:24:52.456650.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T22_24_52.456650", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T22:24:52.456650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T22:24:52.456650.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T22_24_52.456650", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:24:52.456650.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:24:52.456650.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T22_24_52.456650", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T22:24:52.456650.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T22:24:52.456650.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_16T01_12_25.087238", "path": ["**/details_harness|winogrande|5_2023-10-16T01-12-25.087238.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-16T01-12-25.087238.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T22_24_52.456650", "path": ["results_2023-07-19T22:24:52.456650.parquet"]}, {"split": "2023_10_16T01_12_25.087238", "path": ["results_2023-10-16T01-12-25.087238.parquet"]}, {"split": "latest", "path": ["results_2023-10-16T01-12-25.087238.parquet"]}]}]}
|
2023-10-16T00:12:36+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of ausboss/llama-30b-supercot
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model ausboss/llama-30b-supercot on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
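A minimal sketch, assuming the repository follows the `open-llm-leaderboard/details_<org>__<model>` naming pattern used by the sibling cards in this document:

```python
from datasets import load_dataset

# Repository name is an assumption inferred from the details_<org>__<model> pattern;
# the "harness_winogrande_5" config appears in this card's configuration list.
data = load_dataset("open-llm-leaderboard/details_ausboss__llama-30b-supercot",
                    "harness_winogrande_5",
                    split="train")
```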
## Latest results
These are the latest results from run 2023-10-16T01:12:25.087238 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
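The aggregated numbers from that run live in the "results" configuration listed in this card's metadata; a short sketch for retrieving the latest ones, under the same repository-name assumption as above:

```python
from datasets import load_dataset

# "results" config with a "latest" split, per the configuration list in the metadata.
results = load_dataset("open-llm-leaderboard/details_ausboss__llama-30b-supercot",
                       "results",
                       split="latest")
```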
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of ausboss/llama-30b-supercot",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model ausboss/llama-30b-supercot on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-16T01:12:25.087238(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of ausboss/llama-30b-supercot",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model ausboss/llama-30b-supercot on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-16T01:12:25.087238(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
20,
31,
168,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of ausboss/llama-30b-supercot## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model ausboss/llama-30b-supercot on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-16T01:12:25.087238(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
196ccf1f5eac2f7063fdb0485e809e6c61faad6d
|
# Dataset Card for Evaluation run of ausboss/llama-13b-supercot
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/ausboss/llama-13b-supercot
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [ausboss/llama-13b-supercot](https://huggingface.co/ausboss/llama-13b-supercot) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ausboss__llama-13b-supercot",
"harness_winogrande_5",
split="train")
```
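To see the full list of configuration names (per the card, 64 of them: the per-task harness configs plus "results") before loading one, the datasets library's config-listing helper can be used; a minimal sketch:

```python
from datasets import get_dataset_config_names

# Returns the config names declared for this dataset repository.
configs = get_dataset_config_names("open-llm-leaderboard/details_ausboss__llama-13b-supercot")
print(configs)
```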
## Latest results
These are the [latest results from run 2023-10-12T16:35:22.378010](https://huggingface.co/datasets/open-llm-leaderboard/details_ausboss__llama-13b-supercot/blob/main/results_2023-10-12T16-35-22.378010.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.17722315436241612,
"em_stderr": 0.003910577643527697,
"f1": 0.2291652684563762,
"f1_stderr": 0.003972230197820301,
"acc": 0.41485980206717077,
"acc_stderr": 0.00958175025485596
},
"harness|drop|3": {
"em": 0.17722315436241612,
"em_stderr": 0.003910577643527697,
"f1": 0.2291652684563762,
"f1_stderr": 0.003972230197820301
},
"harness|gsm8k|5": {
"acc": 0.07202426080363912,
"acc_stderr": 0.00712114798353713
},
"harness|winogrande|5": {
"acc": 0.7576953433307024,
"acc_stderr": 0.012042352526174789
}
}
```
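The same aggregated figures can be pulled programmatically through the "results" configuration declared in this card's metadata; a minimal sketch:

```python
from datasets import load_dataset

# The "results" config stores the aggregated metrics; "latest" tracks the newest run.
results = load_dataset("open-llm-leaderboard/details_ausboss__llama-13b-supercot",
                       "results",
                       split="latest")
print(results)
```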
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_ausboss__llama-13b-supercot
|
[
"region:us"
] |
2023-08-18T10:17:59+00:00
|
{"pretty_name": "Evaluation run of ausboss/llama-13b-supercot", "dataset_summary": "Dataset automatically created during the evaluation run of model [ausboss/llama-13b-supercot](https://huggingface.co/ausboss/llama-13b-supercot) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ausboss__llama-13b-supercot\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-12T16:35:22.378010](https://huggingface.co/datasets/open-llm-leaderboard/details_ausboss__llama-13b-supercot/blob/main/results_2023-10-12T16-35-22.378010.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.17722315436241612,\n \"em_stderr\": 0.003910577643527697,\n \"f1\": 0.2291652684563762,\n \"f1_stderr\": 0.003972230197820301,\n \"acc\": 0.41485980206717077,\n \"acc_stderr\": 0.00958175025485596\n },\n \"harness|drop|3\": {\n \"em\": 0.17722315436241612,\n \"em_stderr\": 0.003910577643527697,\n \"f1\": 0.2291652684563762,\n \"f1_stderr\": 0.003972230197820301\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.07202426080363912,\n \"acc_stderr\": 0.00712114798353713\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7576953433307024,\n \"acc_stderr\": 0.012042352526174789\n }\n}\n```", "repo_url": "https://huggingface.co/ausboss/llama-13b-supercot", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_18T13_52_51.513214", "path": ["**/details_harness|arc:challenge|25_2023-07-18T13:52:51.513214.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-18T13:52:51.513214.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_12T16_35_22.378010", "path": ["**/details_harness|drop|3_2023-10-12T16-35-22.378010.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-12T16-35-22.378010.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_12T16_35_22.378010", "path": ["**/details_harness|gsm8k|5_2023-10-12T16-35-22.378010.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-12T16-35-22.378010.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_18T13_52_51.513214", "path": ["**/details_harness|hellaswag|10_2023-07-18T13:52:51.513214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-18T13:52:51.513214.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": 
[{"split": "2023_07_18T13_52_51.513214", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T13:52:51.513214.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-18T13:52:51.513214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T13:52:51.513214.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-18T13:52:51.513214.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-18T13:52:51.513214.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_18T13_52_51.513214", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T13:52:51.513214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T13:52:51.513214.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_18T13_52_51.513214", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-18T13:52:51.513214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-18T13:52:51.513214.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_18T13_52_51.513214", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-18T13:52:51.513214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-18T13:52:51.513214.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_18T13_52_51.513214", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T13:52:51.513214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T13:52:51.513214.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_18T13_52_51.513214", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T13:52:51.513214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T13:52:51.513214.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_18T13_52_51.513214", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-18T13:52:51.513214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-18T13:52:51.513214.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_18T13_52_51.513214", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T13:52:51.513214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T13:52:51.513214.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_18T13_52_51.513214", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T13:52:51.513214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T13:52:51.513214.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_18T13_52_51.513214", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T13:52:51.513214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T13:52:51.513214.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_18T13_52_51.513214", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T13:52:51.513214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T13:52:51.513214.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_18T13_52_51.513214", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-18T13:52:51.513214.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-18T13:52:51.513214.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_18T13_52_51.513214", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-18T13:52:51.513214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-18T13:52:51.513214.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_18T13_52_51.513214", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T13:52:51.513214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T13:52:51.513214.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_18T13_52_51.513214", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-18T13:52:51.513214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-18T13:52:51.513214.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_18T13_52_51.513214", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T13:52:51.513214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T13:52:51.513214.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_18T13_52_51.513214", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T13:52:51.513214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T13:52:51.513214.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_18T13_52_51.513214", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T13:52:51.513214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T13:52:51.513214.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_18T13_52_51.513214", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-18T13:52:51.513214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-18T13:52:51.513214.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_18T13_52_51.513214", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T13:52:51.513214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T13:52:51.513214.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_18T13_52_51.513214", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T13:52:51.513214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T13:52:51.513214.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_18T13_52_51.513214", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T13:52:51.513214.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T13:52:51.513214.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_18T13_52_51.513214", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T13:52:51.513214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T13:52:51.513214.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_18T13_52_51.513214", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T13:52:51.513214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T13:52:51.513214.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_18T13_52_51.513214", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T13:52:51.513214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T13:52:51.513214.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_18T13_52_51.513214", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T13:52:51.513214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T13:52:51.513214.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_18T13_52_51.513214", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T13:52:51.513214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T13:52:51.513214.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_18T13_52_51.513214", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T13:52:51.513214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T13:52:51.513214.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_18T13_52_51.513214", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T13:52:51.513214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T13:52:51.513214.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_18T13_52_51.513214", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T13:52:51.513214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T13:52:51.513214.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_18T13_52_51.513214", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T13:52:51.513214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T13:52:51.513214.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_18T13_52_51.513214", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T13:52:51.513214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T13:52:51.513214.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_18T13_52_51.513214", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T13:52:51.513214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T13:52:51.513214.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_18T13_52_51.513214", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-18T13:52:51.513214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-18T13:52:51.513214.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_18T13_52_51.513214", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T13:52:51.513214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T13:52:51.513214.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_18T13_52_51.513214", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-18T13:52:51.513214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-18T13:52:51.513214.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_18T13_52_51.513214", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T13:52:51.513214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T13:52:51.513214.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_18T13_52_51.513214", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T13:52:51.513214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T13:52:51.513214.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_18T13_52_51.513214", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T13:52:51.513214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T13:52:51.513214.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_18T13_52_51.513214", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-18T13:52:51.513214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-18T13:52:51.513214.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_18T13_52_51.513214", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-18T13:52:51.513214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-18T13:52:51.513214.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_18T13_52_51.513214", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T13:52:51.513214.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T13:52:51.513214.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_18T13_52_51.513214", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T13:52:51.513214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T13:52:51.513214.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_18T13_52_51.513214", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T13:52:51.513214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T13:52:51.513214.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_18T13_52_51.513214", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T13:52:51.513214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T13:52:51.513214.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_18T13_52_51.513214", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-18T13:52:51.513214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-18T13:52:51.513214.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_18T13_52_51.513214", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-18T13:52:51.513214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-18T13:52:51.513214.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_18T13_52_51.513214", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-18T13:52:51.513214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-18T13:52:51.513214.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_18T13_52_51.513214", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T13:52:51.513214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T13:52:51.513214.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_18T13_52_51.513214", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-18T13:52:51.513214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-18T13:52:51.513214.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_18T13_52_51.513214", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T13:52:51.513214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T13:52:51.513214.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_18T13_52_51.513214", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T13:52:51.513214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T13:52:51.513214.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_18T13_52_51.513214", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-18T13:52:51.513214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-18T13:52:51.513214.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_18T13_52_51.513214", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-18T13:52:51.513214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-18T13:52:51.513214.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_18T13_52_51.513214", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-18T13:52:51.513214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-18T13:52:51.513214.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_18T13_52_51.513214", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T13:52:51.513214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T13:52:51.513214.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_18T13_52_51.513214", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-18T13:52:51.513214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-18T13:52:51.513214.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_18T13_52_51.513214", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-18T13:52:51.513214.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-18T13:52:51.513214.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_18T13_52_51.513214", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-18T13:52:51.513214.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-18T13:52:51.513214.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_12T16_35_22.378010", "path": ["**/details_harness|winogrande|5_2023-10-12T16-35-22.378010.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-12T16-35-22.378010.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_18T13_52_51.513214", "path": ["results_2023-07-18T13:52:51.513214.parquet"]}, {"split": "2023_10_12T16_35_22.378010", "path": ["results_2023-10-12T16-35-22.378010.parquet"]}, {"split": "latest", "path": ["results_2023-10-12T16-35-22.378010.parquet"]}]}]}
|
2023-10-12T15:35:35+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of ausboss/llama-13b-supercot
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model ausboss/llama-13b-supercot on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
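A minimal sketch, mirroring the snippet shown in the full card for this model earlier in this document:

```python
from datasets import load_dataset

data = load_dataset("open-llm-leaderboard/details_ausboss__llama-13b-supercot",
                    "harness_winogrande_5",
                    split="train")
```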
## Latest results
These are the latest results from run 2023-10-12T16:35:22.378010 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of ausboss/llama-13b-supercot",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model ausboss/llama-13b-supercot on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-12T16:35:22.378010(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of ausboss/llama-13b-supercot",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model ausboss/llama-13b-supercot on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-12T16:35:22.378010(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
20,
31,
168,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of ausboss/llama-13b-supercot## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model ausboss/llama-13b-supercot on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-12T16:35:22.378010(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
3870370a6ed8fdc0de1f287e63bd85589fa76ae5
|
# Dataset Card for Evaluation run of FreedomIntelligence/phoenix-inst-chat-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/FreedomIntelligence/phoenix-inst-chat-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [FreedomIntelligence/phoenix-inst-chat-7b](https://huggingface.co/FreedomIntelligence/phoenix-inst-chat-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_FreedomIntelligence__phoenix-inst-chat-7b",
"harness_winogrande_5",
split="train")
```
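Any configuration named in this card's metadata can be loaded the same way; a short usage sketch for the 5-shot GSM8K details:

```python
from datasets import load_dataset

# "harness_gsm8k_5" and its "latest" split appear in this card's configuration list.
gsm8k_details = load_dataset("open-llm-leaderboard/details_FreedomIntelligence__phoenix-inst-chat-7b",
                             "harness_gsm8k_5",
                             split="latest")
```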
## Latest results
These are the [latest results from run 2023-09-17T23:22:30.864991](https://huggingface.co/datasets/open-llm-leaderboard/details_FreedomIntelligence__phoenix-inst-chat-7b/blob/main/results_2023-09-17T23-22-30.864991.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.20962667785234898,
"em_stderr": 0.004168492875886018,
"f1": 0.26621015100671175,
"f1_stderr": 0.0042103390325487,
"acc": 0.32057213705582843,
"acc_stderr": 0.00834460377574627
},
"harness|drop|3": {
"em": 0.20962667785234898,
"em_stderr": 0.004168492875886018,
"f1": 0.26621015100671175,
"f1_stderr": 0.0042103390325487
},
"harness|gsm8k|5": {
"acc": 0.01288855193328279,
"acc_stderr": 0.003106901266499664
},
"harness|winogrande|5": {
"acc": 0.6282557221783741,
"acc_stderr": 0.013582306284992875
}
}
```
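Per-run splits are named by timestamp, so a specific run can be pinned instead of "latest"; a sketch using the drop config and the run timestamp recorded in this card's metadata:

```python
from datasets import load_dataset

# Split name matches the run timestamp in the card's configuration list.
drop_run = load_dataset("open-llm-leaderboard/details_FreedomIntelligence__phoenix-inst-chat-7b",
                        "harness_drop_3",
                        split="2023_09_17T23_22_30.864991")
```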
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_FreedomIntelligence__phoenix-inst-chat-7b
|
[
"region:us"
] |
2023-08-18T10:18:08+00:00
|
{"pretty_name": "Evaluation run of FreedomIntelligence/phoenix-inst-chat-7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [FreedomIntelligence/phoenix-inst-chat-7b](https://huggingface.co/FreedomIntelligence/phoenix-inst-chat-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_FreedomIntelligence__phoenix-inst-chat-7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-17T23:22:30.864991](https://huggingface.co/datasets/open-llm-leaderboard/details_FreedomIntelligence__phoenix-inst-chat-7b/blob/main/results_2023-09-17T23-22-30.864991.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.20962667785234898,\n \"em_stderr\": 0.004168492875886018,\n \"f1\": 0.26621015100671175,\n \"f1_stderr\": 0.0042103390325487,\n \"acc\": 0.32057213705582843,\n \"acc_stderr\": 0.00834460377574627\n },\n \"harness|drop|3\": {\n \"em\": 0.20962667785234898,\n \"em_stderr\": 0.004168492875886018,\n \"f1\": 0.26621015100671175,\n \"f1_stderr\": 0.0042103390325487\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.01288855193328279,\n \"acc_stderr\": 0.003106901266499664\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6282557221783741,\n \"acc_stderr\": 0.013582306284992875\n }\n}\n```", "repo_url": "https://huggingface.co/FreedomIntelligence/phoenix-inst-chat-7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_16T11_30_37.977923", "path": ["**/details_harness|arc:challenge|25_2023-08-16T11:30:37.977923.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-16T11:30:37.977923.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_17T23_22_30.864991", "path": ["**/details_harness|drop|3_2023-09-17T23-22-30.864991.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-17T23-22-30.864991.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_17T23_22_30.864991", "path": ["**/details_harness|gsm8k|5_2023-09-17T23-22-30.864991.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-17T23-22-30.864991.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_16T11_30_37.977923", "path": ["**/details_harness|hellaswag|10_2023-08-16T11:30:37.977923.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hellaswag|10_2023-08-16T11:30:37.977923.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_16T11_30_37.977923", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-16T11:30:37.977923.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-16T11:30:37.977923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-16T11:30:37.977923.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-16T11:30:37.977923.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-16T11:30:37.977923.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-08-16T11:30:37.977923.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_16T11_30_37.977923", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-16T11:30:37.977923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-16T11:30:37.977923.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_16T11_30_37.977923", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-16T11:30:37.977923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-16T11:30:37.977923.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_16T11_30_37.977923", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-16T11:30:37.977923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-16T11:30:37.977923.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_16T11_30_37.977923", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-16T11:30:37.977923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-16T11:30:37.977923.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_16T11_30_37.977923", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-16T11:30:37.977923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-16T11:30:37.977923.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_16T11_30_37.977923", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-16T11:30:37.977923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-16T11:30:37.977923.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_16T11_30_37.977923", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-16T11:30:37.977923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-16T11:30:37.977923.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_16T11_30_37.977923", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-16T11:30:37.977923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-16T11:30:37.977923.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_16T11_30_37.977923", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-16T11:30:37.977923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-16T11:30:37.977923.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_16T11_30_37.977923", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-16T11:30:37.977923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-16T11:30:37.977923.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_08_16T11_30_37.977923", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-16T11:30:37.977923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-16T11:30:37.977923.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_16T11_30_37.977923", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-16T11:30:37.977923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-16T11:30:37.977923.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_16T11_30_37.977923", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-16T11:30:37.977923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-16T11:30:37.977923.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_16T11_30_37.977923", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-16T11:30:37.977923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-16T11:30:37.977923.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_16T11_30_37.977923", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-16T11:30:37.977923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-16T11:30:37.977923.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_16T11_30_37.977923", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-16T11:30:37.977923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-16T11:30:37.977923.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_16T11_30_37.977923", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-16T11:30:37.977923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-16T11:30:37.977923.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_16T11_30_37.977923", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-16T11:30:37.977923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-16T11:30:37.977923.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_16T11_30_37.977923", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-16T11:30:37.977923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-16T11:30:37.977923.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_16T11_30_37.977923", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-16T11:30:37.977923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-16T11:30:37.977923.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_16T11_30_37.977923", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-16T11:30:37.977923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-16T11:30:37.977923.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_16T11_30_37.977923", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-16T11:30:37.977923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-16T11:30:37.977923.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_16T11_30_37.977923", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-16T11:30:37.977923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-16T11:30:37.977923.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_16T11_30_37.977923", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-16T11:30:37.977923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-16T11:30:37.977923.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_16T11_30_37.977923", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-16T11:30:37.977923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-16T11:30:37.977923.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_16T11_30_37.977923", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-16T11:30:37.977923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-16T11:30:37.977923.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_16T11_30_37.977923", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-16T11:30:37.977923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-16T11:30:37.977923.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_16T11_30_37.977923", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-16T11:30:37.977923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-16T11:30:37.977923.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_16T11_30_37.977923", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-16T11:30:37.977923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-16T11:30:37.977923.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_16T11_30_37.977923", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-16T11:30:37.977923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-16T11:30:37.977923.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_16T11_30_37.977923", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-16T11:30:37.977923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-16T11:30:37.977923.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_16T11_30_37.977923", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-16T11:30:37.977923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-16T11:30:37.977923.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_16T11_30_37.977923", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-16T11:30:37.977923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-16T11:30:37.977923.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_16T11_30_37.977923", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-16T11:30:37.977923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-16T11:30:37.977923.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_16T11_30_37.977923", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-16T11:30:37.977923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-16T11:30:37.977923.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_16T11_30_37.977923", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-16T11:30:37.977923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-16T11:30:37.977923.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_16T11_30_37.977923", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-16T11:30:37.977923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-16T11:30:37.977923.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_16T11_30_37.977923", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-16T11:30:37.977923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-16T11:30:37.977923.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_16T11_30_37.977923", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-16T11:30:37.977923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-16T11:30:37.977923.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_16T11_30_37.977923", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-16T11:30:37.977923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-16T11:30:37.977923.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_16T11_30_37.977923", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-16T11:30:37.977923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-16T11:30:37.977923.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_16T11_30_37.977923", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-16T11:30:37.977923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-16T11:30:37.977923.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_16T11_30_37.977923", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-16T11:30:37.977923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-16T11:30:37.977923.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_16T11_30_37.977923", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-16T11:30:37.977923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-16T11:30:37.977923.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_16T11_30_37.977923", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-16T11:30:37.977923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-16T11:30:37.977923.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_16T11_30_37.977923", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-16T11:30:37.977923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-16T11:30:37.977923.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_16T11_30_37.977923", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-16T11:30:37.977923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-16T11:30:37.977923.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_16T11_30_37.977923", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-16T11:30:37.977923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-16T11:30:37.977923.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_16T11_30_37.977923", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-16T11:30:37.977923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-16T11:30:37.977923.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_16T11_30_37.977923", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-16T11:30:37.977923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-16T11:30:37.977923.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_16T11_30_37.977923", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-16T11:30:37.977923.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-16T11:30:37.977923.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_16T11_30_37.977923", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-16T11:30:37.977923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-16T11:30:37.977923.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_16T11_30_37.977923", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-16T11:30:37.977923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-16T11:30:37.977923.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_16T11_30_37.977923", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-16T11:30:37.977923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-16T11:30:37.977923.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_16T11_30_37.977923", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-16T11:30:37.977923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-16T11:30:37.977923.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_16T11_30_37.977923", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-16T11:30:37.977923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-16T11:30:37.977923.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_16T11_30_37.977923", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-16T11:30:37.977923.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-16T11:30:37.977923.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_16T11_30_37.977923", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-16T11:30:37.977923.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-16T11:30:37.977923.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_17T23_22_30.864991", "path": ["**/details_harness|winogrande|5_2023-09-17T23-22-30.864991.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-17T23-22-30.864991.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_16T11_30_37.977923", "path": ["results_2023-08-16T11:30:37.977923.parquet"]}, {"split": "2023_09_17T23_22_30.864991", "path": ["results_2023-09-17T23-22-30.864991.parquet"]}, {"split": "latest", "path": ["results_2023-09-17T23-22-30.864991.parquet"]}]}]}
|
2023-09-17T22:22:41+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of FreedomIntelligence/phoenix-inst-chat-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/FreedomIntelligence/phoenix-inst-chat-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [FreedomIntelligence/phoenix-inst-chat-7b](https://huggingface.co/FreedomIntelligence/phoenix-inst-chat-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
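```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_FreedomIntelligence__phoenix-inst-chat-7b",
	"harness_winogrande_5",
	split="train")
```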
## Latest results
These are the [latest results from run 2023-09-17T23:22:30.864991](https://huggingface.co/datasets/open-llm-leaderboard/details_FreedomIntelligence__phoenix-inst-chat-7b/blob/main/results_2023-09-17T23-22-30.864991.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
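```python
{
    "all": {
        "em": 0.20962667785234898,
        "em_stderr": 0.004168492875886018,
        "f1": 0.26621015100671175,
        "f1_stderr": 0.0042103390325487,
        "acc": 0.32057213705582843,
        "acc_stderr": 0.00834460377574627
    },
    "harness|drop|3": {
        "em": 0.20962667785234898,
        "em_stderr": 0.004168492875886018,
        "f1": 0.26621015100671175,
        "f1_stderr": 0.0042103390325487
    },
    "harness|gsm8k|5": {
        "acc": 0.01288855193328279,
        "acc_stderr": 0.003106901266499664
    },
    "harness|winogrande|5": {
        "acc": 0.6282557221783741,
        "acc_stderr": 0.013582306284992875
    }
}
```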
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of FreedomIntelligence/phoenix-inst-chat-7b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model FreedomIntelligence/phoenix-inst-chat-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-17T23:22:30.864991(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of FreedomIntelligence/phoenix-inst-chat-7b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model FreedomIntelligence/phoenix-inst-chat-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-17T23:22:30.864991(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
24,
31,
172,
68,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of FreedomIntelligence/phoenix-inst-chat-7b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model FreedomIntelligence/phoenix-inst-chat-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-17T23:22:30.864991(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
a7b9f570120545f4e165c426b67efb75aeeae16d
|
# Dataset Card for Evaluation run of Pirr/pythia-13b-deduped-green_devil
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Pirr/pythia-13b-deduped-green_devil
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [Pirr/pythia-13b-deduped-green_devil](https://huggingface.co/Pirr/pythia-13b-deduped-green_devil) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Pirr__pythia-13b-deduped-green_devil",
"harness_winogrande_5",
split="train")
```
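A specific timestamped run, or the aggregated "results" configuration, can be loaded the same way; a minimal sketch, assuming the split names listed in this dataset's configs (e.g. the winogrande run timestamped `2023_10_28T10_52_35.771401`):

```python
from datasets import load_dataset

# Load one timestamped run instead of the always-current "latest"/"train" split.
run_details = load_dataset("open-llm-leaderboard/details_Pirr__pythia-13b-deduped-green_devil",
	"harness_winogrande_5",
	split="2023_10_28T10_52_35.771401")

# Load the aggregated metrics computed for the leaderboard display.
aggregated = load_dataset("open-llm-leaderboard/details_Pirr__pythia-13b-deduped-green_devil",
	"results",
	split="latest")
```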
## Latest results
These are the [latest results from run 2023-10-28T10:52:35.771401](https://huggingface.co/datasets/open-llm-leaderboard/details_Pirr__pythia-13b-deduped-green_devil/blob/main/results_2023-10-28T10-52-35.771401.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.001153523489932886,
"em_stderr": 0.0003476179896857083,
"f1": 0.04843435402684591,
"f1_stderr": 0.0011963744934619489,
"acc": 0.34526287822984214,
"acc_stderr": 0.008596442508425665
},
"harness|drop|3": {
"em": 0.001153523489932886,
"em_stderr": 0.0003476179896857083,
"f1": 0.04843435402684591,
"f1_stderr": 0.0011963744934619489
},
"harness|gsm8k|5": {
"acc": 0.02122820318423048,
"acc_stderr": 0.003970449129848635
},
"harness|winogrande|5": {
"acc": 0.6692975532754538,
"acc_stderr": 0.013222435887002696
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_Pirr__pythia-13b-deduped-green_devil
|
[
"region:us"
] |
2023-08-18T10:18:17+00:00
|
{"pretty_name": "Evaluation run of Pirr/pythia-13b-deduped-green_devil", "dataset_summary": "Dataset automatically created during the evaluation run of model [Pirr/pythia-13b-deduped-green_devil](https://huggingface.co/Pirr/pythia-13b-deduped-green_devil) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Pirr__pythia-13b-deduped-green_devil\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-28T10:52:35.771401](https://huggingface.co/datasets/open-llm-leaderboard/details_Pirr__pythia-13b-deduped-green_devil/blob/main/results_2023-10-28T10-52-35.771401.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.001153523489932886,\n \"em_stderr\": 0.0003476179896857083,\n \"f1\": 0.04843435402684591,\n \"f1_stderr\": 0.0011963744934619489,\n \"acc\": 0.34526287822984214,\n \"acc_stderr\": 0.008596442508425665\n },\n \"harness|drop|3\": {\n \"em\": 0.001153523489932886,\n \"em_stderr\": 0.0003476179896857083,\n \"f1\": 0.04843435402684591,\n \"f1_stderr\": 0.0011963744934619489\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.02122820318423048,\n \"acc_stderr\": 0.003970449129848635\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6692975532754538,\n \"acc_stderr\": 0.013222435887002696\n }\n}\n```", "repo_url": "https://huggingface.co/Pirr/pythia-13b-deduped-green_devil", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_18T14_06_19.404792", "path": ["**/details_harness|arc:challenge|25_2023-07-18T14:06:19.404792.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-18T14:06:19.404792.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_28T10_52_35.771401", "path": ["**/details_harness|drop|3_2023-10-28T10-52-35.771401.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-28T10-52-35.771401.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_28T10_52_35.771401", "path": ["**/details_harness|gsm8k|5_2023-10-28T10-52-35.771401.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-28T10-52-35.771401.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_18T14_06_19.404792", "path": ["**/details_harness|hellaswag|10_2023-07-18T14:06:19.404792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-18T14:06:19.404792.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_18T14_06_19.404792", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T14:06:19.404792.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-18T14:06:19.404792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T14:06:19.404792.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-18T14:06:19.404792.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-18T14:06:19.404792.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_18T14_06_19.404792", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T14:06:19.404792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T14:06:19.404792.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_18T14_06_19.404792", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-18T14:06:19.404792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-18T14:06:19.404792.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_18T14_06_19.404792", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-18T14:06:19.404792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-18T14:06:19.404792.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_18T14_06_19.404792", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T14:06:19.404792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T14:06:19.404792.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_18T14_06_19.404792", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T14:06:19.404792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T14:06:19.404792.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_18T14_06_19.404792", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-18T14:06:19.404792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-18T14:06:19.404792.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_18T14_06_19.404792", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T14:06:19.404792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T14:06:19.404792.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_18T14_06_19.404792", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T14:06:19.404792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T14:06:19.404792.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_18T14_06_19.404792", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T14:06:19.404792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T14:06:19.404792.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_18T14_06_19.404792", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T14:06:19.404792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T14:06:19.404792.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_18T14_06_19.404792", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-18T14:06:19.404792.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-18T14:06:19.404792.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_18T14_06_19.404792", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-18T14:06:19.404792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-18T14:06:19.404792.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_18T14_06_19.404792", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T14:06:19.404792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T14:06:19.404792.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_18T14_06_19.404792", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-18T14:06:19.404792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-18T14:06:19.404792.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_18T14_06_19.404792", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T14:06:19.404792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T14:06:19.404792.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_18T14_06_19.404792", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T14:06:19.404792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T14:06:19.404792.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_18T14_06_19.404792", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T14:06:19.404792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T14:06:19.404792.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_18T14_06_19.404792", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-18T14:06:19.404792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-18T14:06:19.404792.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_18T14_06_19.404792", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T14:06:19.404792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T14:06:19.404792.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_18T14_06_19.404792", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T14:06:19.404792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T14:06:19.404792.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_18T14_06_19.404792", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T14:06:19.404792.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T14:06:19.404792.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_18T14_06_19.404792", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T14:06:19.404792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T14:06:19.404792.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_18T14_06_19.404792", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T14:06:19.404792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T14:06:19.404792.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_18T14_06_19.404792", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T14:06:19.404792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T14:06:19.404792.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_18T14_06_19.404792", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T14:06:19.404792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T14:06:19.404792.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_18T14_06_19.404792", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T14:06:19.404792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T14:06:19.404792.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_18T14_06_19.404792", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T14:06:19.404792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T14:06:19.404792.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_18T14_06_19.404792", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T14:06:19.404792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T14:06:19.404792.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_18T14_06_19.404792", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T14:06:19.404792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T14:06:19.404792.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_18T14_06_19.404792", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T14:06:19.404792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T14:06:19.404792.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_18T14_06_19.404792", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T14:06:19.404792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T14:06:19.404792.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_18T14_06_19.404792", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T14:06:19.404792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T14:06:19.404792.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_18T14_06_19.404792", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-18T14:06:19.404792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-18T14:06:19.404792.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_18T14_06_19.404792", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T14:06:19.404792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T14:06:19.404792.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_18T14_06_19.404792", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-18T14:06:19.404792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-18T14:06:19.404792.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_18T14_06_19.404792", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T14:06:19.404792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T14:06:19.404792.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_18T14_06_19.404792", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T14:06:19.404792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T14:06:19.404792.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_18T14_06_19.404792", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T14:06:19.404792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T14:06:19.404792.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_18T14_06_19.404792", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-18T14:06:19.404792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-18T14:06:19.404792.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_18T14_06_19.404792", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-18T14:06:19.404792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-18T14:06:19.404792.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_18T14_06_19.404792", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T14:06:19.404792.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T14:06:19.404792.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_18T14_06_19.404792", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T14:06:19.404792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T14:06:19.404792.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_18T14_06_19.404792", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T14:06:19.404792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T14:06:19.404792.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_18T14_06_19.404792", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T14:06:19.404792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T14:06:19.404792.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_18T14_06_19.404792", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-18T14:06:19.404792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-18T14:06:19.404792.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_18T14_06_19.404792", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-18T14:06:19.404792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-18T14:06:19.404792.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_18T14_06_19.404792", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-18T14:06:19.404792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-18T14:06:19.404792.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_18T14_06_19.404792", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T14:06:19.404792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T14:06:19.404792.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_18T14_06_19.404792", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-18T14:06:19.404792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-18T14:06:19.404792.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_18T14_06_19.404792", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T14:06:19.404792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T14:06:19.404792.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_18T14_06_19.404792", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T14:06:19.404792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T14:06:19.404792.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_18T14_06_19.404792", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-18T14:06:19.404792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-18T14:06:19.404792.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_18T14_06_19.404792", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-18T14:06:19.404792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-18T14:06:19.404792.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_18T14_06_19.404792", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-18T14:06:19.404792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-18T14:06:19.404792.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_18T14_06_19.404792", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T14:06:19.404792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T14:06:19.404792.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_18T14_06_19.404792", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-18T14:06:19.404792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-18T14:06:19.404792.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_18T14_06_19.404792", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-18T14:06:19.404792.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-18T14:06:19.404792.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_18T14_06_19.404792", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-18T14:06:19.404792.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-18T14:06:19.404792.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_28T10_52_35.771401", "path": ["**/details_harness|winogrande|5_2023-10-28T10-52-35.771401.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-28T10-52-35.771401.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_18T14_06_19.404792", "path": ["results_2023-07-18T14:06:19.404792.parquet"]}, {"split": "2023_10_28T10_52_35.771401", "path": ["results_2023-10-28T10-52-35.771401.parquet"]}, {"split": "latest", "path": ["results_2023-10-28T10-52-35.771401.parquet"]}]}]}
|
2023-10-28T09:52:47+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of Pirr/pythia-13b-deduped-green_devil
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model Pirr/pythia-13b-deduped-green_devil on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
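A minimal loading sketch; the repository id below is an assumption, inferred from the org__model naming pattern used by the other cards in this document:

```python
from datasets import load_dataset

# Assumed repository id (org and model name joined with a double underscore).
data = load_dataset(
    "open-llm-leaderboard/details_Pirr__pythia-13b-deduped-green_devil",
    "harness_winogrande_5",
    split="train",
)
```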
## Latest results
These are the latest results from run 2023-10-28T10:52:35.771401 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split for each eval):
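If the numeric values are needed, they can be pulled from the "results" configuration, whose "latest" split points at this run. A sketch, under the same repository-id assumption as above:

```python
from datasets import load_dataset

# "results" aggregates every run; the "latest" split points at the newest one.
results = load_dataset(
    "open-llm-leaderboard/details_Pirr__pythia-13b-deduped-green_devil",
    "results",
    split="latest",
)
print(results[0])  # aggregated metrics for the 2023-10-28 run
```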
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of Pirr/pythia-13b-deduped-green_devil",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Pirr/pythia-13b-deduped-green_devil on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-28T10:52:35.771401(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Pirr/pythia-13b-deduped-green_devil",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Pirr/pythia-13b-deduped-green_devil on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-28T10:52:35.771401(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
25,
31,
173,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Pirr/pythia-13b-deduped-green_devil## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Pirr/pythia-13b-deduped-green_devil on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-28T10:52:35.771401(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
cdf34ee4357d4f0b4adc84ccfbc4eec1b60b441f
|
# Dataset Card for Evaluation run of alibidaran/medical_transcription_generator
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/alibidaran/medical_transcription_generator
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [alibidaran/medical_transcription_generator](https://huggingface.co/alibidaran/medical_transcription_generator) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_alibidaran__medical_transcription_generator",
"harness_winogrande_5",
split="train")
```
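Because each evaluated task is exposed as its own configuration, the available config names can be enumerated before picking one. A small optional sketch:

```python
from datasets import get_dataset_config_names

# Returns every available configuration name, including the aggregated "results" config.
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_alibidaran__medical_transcription_generator"
)
print(len(configs), configs[:5])
```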
## Latest results
These are the [latest results from run 2023-10-24T21:21:57.279571](https://huggingface.co/datasets/open-llm-leaderboard/details_alibidaran__medical_transcription_generator/blob/main/results_2023-10-24T21-21-57.279571.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.0019924496644295304,
"em_stderr": 0.00045666764626670043,
"f1": 0.014220847315436273,
"f1_stderr": 0.0008227845342657542,
"acc": 0.2521704814522494,
"acc_stderr": 0.00702597803203845
},
"harness|drop|3": {
"em": 0.0019924496644295304,
"em_stderr": 0.00045666764626670043,
"f1": 0.014220847315436273,
"f1_stderr": 0.0008227845342657542
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.5043409629044988,
"acc_stderr": 0.0140519560640769
}
}
```
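As a quick illustration of how to read the stderr fields, a 95% normal-approximation confidence interval for the Winogrande accuracy can be computed from the values above (an illustrative calculation, not part of the harness output):

```python
# Values copied from the results block above.
acc = 0.5043409629044988
acc_stderr = 0.0140519560640769

# 95% normal-approximation confidence interval.
low, high = acc - 1.96 * acc_stderr, acc + 1.96 * acc_stderr
print(f"winogrande acc: {acc:.4f}, 95% CI [{low:.4f}, {high:.4f}]")
```

The interval comfortably contains 0.5, so the model's Winogrande accuracy is statistically indistinguishable from chance on this binary-choice task.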
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_alibidaran__medical_transcription_generator
|
[
"region:us"
] |
2023-08-18T10:18:25+00:00
|
{"pretty_name": "Evaluation run of alibidaran/medical_transcription_generator", "dataset_summary": "Dataset automatically created during the evaluation run of model [alibidaran/medical_transcription_generator](https://huggingface.co/alibidaran/medical_transcription_generator) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_alibidaran__medical_transcription_generator\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-24T21:21:57.279571](https://huggingface.co/datasets/open-llm-leaderboard/details_alibidaran__medical_transcription_generator/blob/main/results_2023-10-24T21-21-57.279571.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0019924496644295304,\n \"em_stderr\": 0.00045666764626670043,\n \"f1\": 0.014220847315436273,\n \"f1_stderr\": 0.0008227845342657542,\n \"acc\": 0.2521704814522494,\n \"acc_stderr\": 0.00702597803203845\n },\n \"harness|drop|3\": {\n \"em\": 0.0019924496644295304,\n \"em_stderr\": 0.00045666764626670043,\n \"f1\": 0.014220847315436273,\n \"f1_stderr\": 0.0008227845342657542\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5043409629044988,\n \"acc_stderr\": 0.0140519560640769\n }\n}\n```", "repo_url": "https://huggingface.co/alibidaran/medical_transcription_generator", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T19_11_07.660746", "path": ["**/details_harness|arc:challenge|25_2023-07-19T19:11:07.660746.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T19:11:07.660746.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_24T21_21_57.279571", "path": ["**/details_harness|drop|3_2023-10-24T21-21-57.279571.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-24T21-21-57.279571.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_24T21_21_57.279571", "path": ["**/details_harness|gsm8k|5_2023-10-24T21-21-57.279571.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-24T21-21-57.279571.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T19_11_07.660746", "path": ["**/details_harness|hellaswag|10_2023-07-19T19:11:07.660746.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hellaswag|10_2023-07-19T19:11:07.660746.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T19_11_07.660746", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:11:07.660746.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:11:07.660746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:11:07.660746.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:11:07.660746.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T19:11:07.660746.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:11:07.660746.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_19T19_11_07.660746", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:11:07.660746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:11:07.660746.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T19_11_07.660746", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:11:07.660746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:11:07.660746.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T19_11_07.660746", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:11:07.660746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:11:07.660746.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T19_11_07.660746", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:11:07.660746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:11:07.660746.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T19_11_07.660746", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:11:07.660746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:11:07.660746.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T19_11_07.660746", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:11:07.660746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:11:07.660746.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T19_11_07.660746", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:11:07.660746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:11:07.660746.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T19_11_07.660746", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:11:07.660746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:11:07.660746.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T19_11_07.660746", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:11:07.660746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:11:07.660746.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T19_11_07.660746", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:11:07.660746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:11:07.660746.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_07_19T19_11_07.660746", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:11:07.660746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:11:07.660746.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T19_11_07.660746", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:11:07.660746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:11:07.660746.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T19_11_07.660746", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:11:07.660746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:11:07.660746.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T19_11_07.660746", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:11:07.660746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:11:07.660746.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T19_11_07.660746", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:11:07.660746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:11:07.660746.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T19_11_07.660746", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:11:07.660746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:11:07.660746.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T19_11_07.660746", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:11:07.660746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:11:07.660746.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T19_11_07.660746", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:11:07.660746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:11:07.660746.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T19_11_07.660746", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:11:07.660746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:11:07.660746.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T19_11_07.660746", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:11:07.660746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:11:07.660746.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T19_11_07.660746", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:11:07.660746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:11:07.660746.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T19_11_07.660746", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:11:07.660746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:11:07.660746.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T19_11_07.660746", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:11:07.660746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:11:07.660746.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T19_11_07.660746", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:11:07.660746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:11:07.660746.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T19_11_07.660746", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:11:07.660746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:11:07.660746.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T19_11_07.660746", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:11:07.660746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:11:07.660746.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T19_11_07.660746", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:11:07.660746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:11:07.660746.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T19_11_07.660746", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:11:07.660746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:11:07.660746.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T19_11_07.660746", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:11:07.660746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:11:07.660746.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T19_11_07.660746", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:11:07.660746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:11:07.660746.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T19_11_07.660746", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:11:07.660746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:11:07.660746.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T19_11_07.660746", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:11:07.660746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:11:07.660746.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T19_11_07.660746", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:11:07.660746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:11:07.660746.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T19_11_07.660746", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:11:07.660746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:11:07.660746.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T19_11_07.660746", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:11:07.660746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:11:07.660746.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T19_11_07.660746", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:11:07.660746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:11:07.660746.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T19_11_07.660746", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:11:07.660746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:11:07.660746.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T19_11_07.660746", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:11:07.660746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:11:07.660746.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T19_11_07.660746", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T19:11:07.660746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T19:11:07.660746.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T19_11_07.660746", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:11:07.660746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:11:07.660746.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T19_11_07.660746", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:11:07.660746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:11:07.660746.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T19_11_07.660746", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:11:07.660746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:11:07.660746.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T19_11_07.660746", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:11:07.660746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:11:07.660746.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T19_11_07.660746", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:11:07.660746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:11:07.660746.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T19_11_07.660746", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:11:07.660746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:11:07.660746.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T19_11_07.660746", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:11:07.660746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:11:07.660746.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T19_11_07.660746", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:11:07.660746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:11:07.660746.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T19_11_07.660746", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:11:07.660746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:11:07.660746.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T19_11_07.660746", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:11:07.660746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:11:07.660746.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T19_11_07.660746", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:11:07.660746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:11:07.660746.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T19_11_07.660746", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:11:07.660746.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:11:07.660746.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_19T19_11_07.660746", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:11:07.660746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:11:07.660746.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T19_11_07.660746", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:11:07.660746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:11:07.660746.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T19_11_07.660746", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:11:07.660746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:11:07.660746.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T19_11_07.660746", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:11:07.660746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:11:07.660746.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T19_11_07.660746", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T19:11:07.660746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T19:11:07.660746.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T19_11_07.660746", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:11:07.660746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:11:07.660746.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T19_11_07.660746", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T19:11:07.660746.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T19:11:07.660746.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_24T21_21_57.279571", "path": ["**/details_harness|winogrande|5_2023-10-24T21-21-57.279571.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-24T21-21-57.279571.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T19_11_07.660746", "path": ["results_2023-07-19T19:11:07.660746.parquet"]}, {"split": "2023_10_24T21_21_57.279571", "path": ["results_2023-10-24T21-21-57.279571.parquet"]}, {"split": "latest", "path": ["results_2023-10-24T21-21-57.279571.parquet"]}]}]}
|
2023-10-24T20:22:09+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of alibidaran/medical_transcription_generator
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model alibidaran/medical_transcription_generator on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
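The code snippet is omitted in this cleaned copy; a minimal sketch, assuming the repository id follows the leaderboard's `details_<org>__<model>` naming convention used elsewhere in this document:

```python
from datasets import load_dataset

# NOTE: repository id assumed from the leaderboard's details_<org>__<model> convention
data = load_dataset(
    "open-llm-leaderboard/details_alibidaran__medical_transcription_generator",
    "harness_winogrande_5",  # any config name listed in the repo metadata works here
    split="train",
)
```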
## Latest results
These are the latest results from run 2023-10-24T21:21:57.279571 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of alibidaran/medical_transcription_generator",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model alibidaran/medical_transcription_generator on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-24T21:21:57.279571(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of alibidaran/medical_transcription_generator",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model alibidaran/medical_transcription_generator on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-24T21:21:57.279571(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
21,
31,
169,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of alibidaran/medical_transcription_generator## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model alibidaran/medical_transcription_generator on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-24T21:21:57.279571(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
0d085c6fb1354d3b144f6b7011f2eee5965684b4
|
# Dataset Card for Evaluation run of ToolBench/ToolLLaMA-7b-LoRA
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/ToolBench/ToolLLaMA-7b-LoRA
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [ToolBench/ToolLLaMA-7b-LoRA](https://huggingface.co/ToolBench/ToolLLaMA-7b-LoRA) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ToolBench__ToolLLaMA-7b-LoRA",
"harness_winogrande_5",
split="train")
```
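The aggregated metrics can be pulled the same way by pointing at the "results" configuration mentioned above; a sketch (the timestamped split names are listed in the repository metadata):

```python
from datasets import load_dataset

# "results" stores the aggregated metrics; the "latest" split tracks the newest run
results = load_dataset(
    "open-llm-leaderboard/details_ToolBench__ToolLLaMA-7b-LoRA",
    "results",
    split="latest",
)
```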
## Latest results
These are the [latest results from run 2023-09-18T01:38:32.661486](https://huggingface.co/datasets/open-llm-leaderboard/details_ToolBench__ToolLLaMA-7b-LoRA/blob/main/results_2023-09-18T01-38-32.661486.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0010486577181208054,
"em_stderr": 0.00033145814652191404,
"f1": 0.056097944630872455,
"f1_stderr": 0.001312187728090684,
"acc": 0.40586103293913917,
"acc_stderr": 0.00960950347641371
},
"harness|drop|3": {
"em": 0.0010486577181208054,
"em_stderr": 0.00033145814652191404,
"f1": 0.056097944630872455,
"f1_stderr": 0.001312187728090684
},
"harness|gsm8k|5": {
"acc": 0.06823351023502654,
"acc_stderr": 0.006945358944067431
},
"harness|winogrande|5": {
"acc": 0.7434885556432518,
"acc_stderr": 0.01227364800875999
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_ToolBench__ToolLLaMA-7b-LoRA
|
[
"region:us"
] |
2023-08-18T10:18:34+00:00
|
{"pretty_name": "Evaluation run of ToolBench/ToolLLaMA-7b-LoRA", "dataset_summary": "Dataset automatically created during the evaluation run of model [ToolBench/ToolLLaMA-7b-LoRA](https://huggingface.co/ToolBench/ToolLLaMA-7b-LoRA) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ToolBench__ToolLLaMA-7b-LoRA\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-18T01:38:32.661486](https://huggingface.co/datasets/open-llm-leaderboard/details_ToolBench__ToolLLaMA-7b-LoRA/blob/main/results_2023-09-18T01-38-32.661486.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0010486577181208054,\n \"em_stderr\": 0.00033145814652191404,\n \"f1\": 0.056097944630872455,\n \"f1_stderr\": 0.001312187728090684,\n \"acc\": 0.40586103293913917,\n \"acc_stderr\": 0.00960950347641371\n },\n \"harness|drop|3\": {\n \"em\": 0.0010486577181208054,\n \"em_stderr\": 0.00033145814652191404,\n \"f1\": 0.056097944630872455,\n \"f1_stderr\": 0.001312187728090684\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.06823351023502654,\n \"acc_stderr\": 0.006945358944067431\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7434885556432518,\n \"acc_stderr\": 0.01227364800875999\n }\n}\n```", "repo_url": "https://huggingface.co/ToolBench/ToolLLaMA-7b-LoRA", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_09T17_09_39.923597", "path": ["**/details_harness|arc:challenge|25_2023-08-09T17:09:39.923597.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-09T17:09:39.923597.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_18T01_38_32.661486", "path": ["**/details_harness|drop|3_2023-09-18T01-38-32.661486.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-18T01-38-32.661486.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_18T01_38_32.661486", "path": ["**/details_harness|gsm8k|5_2023-09-18T01-38-32.661486.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-18T01-38-32.661486.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_09T17_09_39.923597", "path": ["**/details_harness|hellaswag|10_2023-08-09T17:09:39.923597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-09T17:09:39.923597.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", 
"data_files": [{"split": "2023_08_09T17_09_39.923597", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T17:09:39.923597.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T17:09:39.923597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T17:09:39.923597.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T17:09:39.923597.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T17:09:39.923597.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_08_09T17_09_39.923597", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T17:09:39.923597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T17:09:39.923597.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_09T17_09_39.923597", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T17:09:39.923597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T17:09:39.923597.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_09T17_09_39.923597", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T17:09:39.923597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T17:09:39.923597.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_09T17_09_39.923597", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T17:09:39.923597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T17:09:39.923597.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_09T17_09_39.923597", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T17:09:39.923597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T17:09:39.923597.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_09T17_09_39.923597", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T17:09:39.923597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T17:09:39.923597.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_09T17_09_39.923597", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T17:09:39.923597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T17:09:39.923597.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_09T17_09_39.923597", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T17:09:39.923597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T17:09:39.923597.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_09T17_09_39.923597", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T17:09:39.923597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T17:09:39.923597.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_09T17_09_39.923597", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T17:09:39.923597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T17:09:39.923597.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_09T17_09_39.923597", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T17:09:39.923597.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T17:09:39.923597.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_09T17_09_39.923597", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T17:09:39.923597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T17:09:39.923597.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_09T17_09_39.923597", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T17:09:39.923597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T17:09:39.923597.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_09T17_09_39.923597", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T17:09:39.923597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T17:09:39.923597.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_09T17_09_39.923597", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T17:09:39.923597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T17:09:39.923597.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_09T17_09_39.923597", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T17:09:39.923597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T17:09:39.923597.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_09T17_09_39.923597", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T17:09:39.923597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T17:09:39.923597.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_09T17_09_39.923597", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T17:09:39.923597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T17:09:39.923597.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_09T17_09_39.923597", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T17:09:39.923597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T17:09:39.923597.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_09T17_09_39.923597", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T17:09:39.923597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T17:09:39.923597.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_09T17_09_39.923597", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T17:09:39.923597.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T17:09:39.923597.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_09T17_09_39.923597", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T17:09:39.923597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T17:09:39.923597.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_09T17_09_39.923597", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T17:09:39.923597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T17:09:39.923597.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_09T17_09_39.923597", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T17:09:39.923597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T17:09:39.923597.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_09T17_09_39.923597", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T17:09:39.923597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T17:09:39.923597.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_09T17_09_39.923597", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T17:09:39.923597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T17:09:39.923597.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_09T17_09_39.923597", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T17:09:39.923597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T17:09:39.923597.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_09T17_09_39.923597", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T17:09:39.923597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T17:09:39.923597.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_09T17_09_39.923597", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T17:09:39.923597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T17:09:39.923597.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_09T17_09_39.923597", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T17:09:39.923597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T17:09:39.923597.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_09T17_09_39.923597", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T17:09:39.923597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T17:09:39.923597.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_09T17_09_39.923597", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T17:09:39.923597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T17:09:39.923597.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_09T17_09_39.923597", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T17:09:39.923597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T17:09:39.923597.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_09T17_09_39.923597", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T17:09:39.923597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T17:09:39.923597.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_09T17_09_39.923597", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T17:09:39.923597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T17:09:39.923597.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_09T17_09_39.923597", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T17:09:39.923597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T17:09:39.923597.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_09T17_09_39.923597", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T17:09:39.923597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T17:09:39.923597.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_09T17_09_39.923597", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T17:09:39.923597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T17:09:39.923597.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_09T17_09_39.923597", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T17:09:39.923597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T17:09:39.923597.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_09T17_09_39.923597", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T17:09:39.923597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T17:09:39.923597.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_09T17_09_39.923597", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T17:09:39.923597.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T17:09:39.923597.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_09T17_09_39.923597", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T17:09:39.923597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T17:09:39.923597.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_09T17_09_39.923597", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T17:09:39.923597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T17:09:39.923597.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_09T17_09_39.923597", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T17:09:39.923597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T17:09:39.923597.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_09T17_09_39.923597", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T17:09:39.923597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T17:09:39.923597.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_09T17_09_39.923597", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T17:09:39.923597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T17:09:39.923597.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_09T17_09_39.923597", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T17:09:39.923597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T17:09:39.923597.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_09T17_09_39.923597", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T17:09:39.923597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T17:09:39.923597.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_09T17_09_39.923597", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T17:09:39.923597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T17:09:39.923597.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_09T17_09_39.923597", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T17:09:39.923597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T17:09:39.923597.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_09T17_09_39.923597", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T17:09:39.923597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T17:09:39.923597.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_08_09T17_09_39.923597", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T17:09:39.923597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T17:09:39.923597.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_09T17_09_39.923597", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T17:09:39.923597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T17:09:39.923597.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_09T17_09_39.923597", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T17:09:39.923597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T17:09:39.923597.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_09T17_09_39.923597", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T17:09:39.923597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T17:09:39.923597.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_09T17_09_39.923597", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T17:09:39.923597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T17:09:39.923597.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_09T17_09_39.923597", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T17:09:39.923597.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T17:09:39.923597.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_09T17_09_39.923597", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T17:09:39.923597.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T17:09:39.923597.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_18T01_38_32.661486", "path": ["**/details_harness|winogrande|5_2023-09-18T01-38-32.661486.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-18T01-38-32.661486.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_09T17_09_39.923597", "path": ["results_2023-08-09T17:09:39.923597.parquet"]}, {"split": "2023_09_18T01_38_32.661486", "path": ["results_2023-09-18T01-38-32.661486.parquet"]}, {"split": "latest", "path": ["results_2023-09-18T01-38-32.661486.parquet"]}]}]}
|
2023-09-18T00:38:44+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of ToolBench/ToolLLaMA-7b-LoRA
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model ToolBench/ToolLLaMA-7b-LoRA on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
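The snippet is omitted in this cleaned copy; it mirrors the call shown in the full card above:

```python
from datasets import load_dataset

data = load_dataset(
    "open-llm-leaderboard/details_ToolBench__ToolLLaMA-7b-LoRA",
    "harness_winogrande_5",
    split="train",
)
```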
## Latest results
These are the latest results from run 2023-09-18T01:38:32.661486 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of ToolBench/ToolLLaMA-7b-LoRA",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model ToolBench/ToolLLaMA-7b-LoRA on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-18T01:38:32.661486(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of ToolBench/ToolLLaMA-7b-LoRA",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model ToolBench/ToolLLaMA-7b-LoRA on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-18T01:38:32.661486(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
22,
31,
170,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of ToolBench/ToolLLaMA-7b-LoRA## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model ToolBench/ToolLLaMA-7b-LoRA on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-18T01:38:32.661486(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
84429ec88af87b06e5aa794ec5cd7091179d5193
|
# Dataset Card for Evaluation run of medalpaca/medalpaca-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/medalpaca/medalpaca-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [medalpaca/medalpaca-7b](https://huggingface.co/medalpaca/medalpaca-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_medalpaca__medalpaca-7b",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-13T02:37:55.174881](https://huggingface.co/datasets/open-llm-leaderboard/details_medalpaca__medalpaca-7b/blob/main/results_2023-10-13T02-37-55.174881.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.1761744966442953,
"em_stderr": 0.003901474629801755,
"f1": 0.24214345637583887,
"f1_stderr": 0.003972046949089224,
"acc": 0.37112196044335327,
"acc_stderr": 0.008725686094881443
},
"harness|drop|3": {
"em": 0.1761744966442953,
"em_stderr": 0.003901474629801755,
"f1": 0.24214345637583887,
"f1_stderr": 0.003972046949089224
},
"harness|gsm8k|5": {
"acc": 0.030326004548900682,
"acc_stderr": 0.004723487465514772
},
"harness|winogrande|5": {
"acc": 0.7119179163378059,
"acc_stderr": 0.012727884724248115
}
}
```
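The aggregated numbers above are also exposed through the "results" configuration mentioned in the summary. The config name and the "latest" split both appear in this card's metadata, but the exact column layout of the parquet files is not documented here, so the sketch below only loads the data and inspects it:

```python
from datasets import load_dataset

# "results" aggregates all runs of this model; "latest" points to the most recent one.
results = load_dataset("open-llm-leaderboard/details_medalpaca__medalpaca-7b",
                       "results",
                       split="latest")

# The field layout is not documented in this card, so inspect it before indexing.
print(results.column_names)
print(results[0])
```

Earlier runs remain available under timestamp-named splits (e.g. "2023_07_19T16_30_25.304813" for this card), which makes it possible to compare successive evaluations of the same model.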
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_medalpaca__medalpaca-7b
|
[
"region:us"
] |
2023-08-18T10:18:43+00:00
|
{"pretty_name": "Evaluation run of medalpaca/medalpaca-7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [medalpaca/medalpaca-7b](https://huggingface.co/medalpaca/medalpaca-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_medalpaca__medalpaca-7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-13T02:37:55.174881](https://huggingface.co/datasets/open-llm-leaderboard/details_medalpaca__medalpaca-7b/blob/main/results_2023-10-13T02-37-55.174881.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.1761744966442953,\n \"em_stderr\": 0.003901474629801755,\n \"f1\": 0.24214345637583887,\n \"f1_stderr\": 0.003972046949089224,\n \"acc\": 0.37112196044335327,\n \"acc_stderr\": 0.008725686094881443\n },\n \"harness|drop|3\": {\n \"em\": 0.1761744966442953,\n \"em_stderr\": 0.003901474629801755,\n \"f1\": 0.24214345637583887,\n \"f1_stderr\": 0.003972046949089224\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.030326004548900682,\n \"acc_stderr\": 0.004723487465514772\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7119179163378059,\n \"acc_stderr\": 0.012727884724248115\n }\n}\n```", "repo_url": "https://huggingface.co/medalpaca/medalpaca-7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T16_30_25.304813", "path": ["**/details_harness|arc:challenge|25_2023-07-19T16:30:25.304813.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T16:30:25.304813.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_13T02_37_55.174881", "path": ["**/details_harness|drop|3_2023-10-13T02-37-55.174881.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-13T02-37-55.174881.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_13T02_37_55.174881", "path": ["**/details_harness|gsm8k|5_2023-10-13T02-37-55.174881.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-13T02-37-55.174881.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T16_30_25.304813", "path": ["**/details_harness|hellaswag|10_2023-07-19T16:30:25.304813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T16:30:25.304813.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_07_19T16_30_25.304813", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:30:25.304813.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:30:25.304813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:30:25.304813.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T16:30:25.304813.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:30:25.304813.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T16_30_25.304813", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:30:25.304813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:30:25.304813.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T16_30_25.304813", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:30:25.304813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:30:25.304813.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T16_30_25.304813", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:30:25.304813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:30:25.304813.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T16_30_25.304813", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:30:25.304813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:30:25.304813.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T16_30_25.304813", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:30:25.304813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:30:25.304813.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T16_30_25.304813", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:30:25.304813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:30:25.304813.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T16_30_25.304813", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:30:25.304813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:30:25.304813.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T16_30_25.304813", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:30:25.304813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:30:25.304813.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T16_30_25.304813", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:30:25.304813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:30:25.304813.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T16_30_25.304813", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:30:25.304813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:30:25.304813.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T16_30_25.304813", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:30:25.304813.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:30:25.304813.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T16_30_25.304813", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:30:25.304813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:30:25.304813.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T16_30_25.304813", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:30:25.304813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:30:25.304813.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T16_30_25.304813", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:30:25.304813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:30:25.304813.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T16_30_25.304813", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:30:25.304813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:30:25.304813.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T16_30_25.304813", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:30:25.304813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:30:25.304813.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T16_30_25.304813", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:30:25.304813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:30:25.304813.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T16_30_25.304813", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:30:25.304813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:30:25.304813.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T16_30_25.304813", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:30:25.304813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:30:25.304813.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T16_30_25.304813", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:30:25.304813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:30:25.304813.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T16_30_25.304813", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:30:25.304813.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:30:25.304813.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T16_30_25.304813", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:30:25.304813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:30:25.304813.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T16_30_25.304813", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:30:25.304813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:30:25.304813.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T16_30_25.304813", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:30:25.304813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:30:25.304813.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T16_30_25.304813", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:30:25.304813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:30:25.304813.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T16_30_25.304813", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:30:25.304813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:30:25.304813.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T16_30_25.304813", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:30:25.304813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:30:25.304813.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T16_30_25.304813", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:30:25.304813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:30:25.304813.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T16_30_25.304813", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:30:25.304813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:30:25.304813.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T16_30_25.304813", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:30:25.304813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:30:25.304813.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T16_30_25.304813", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:30:25.304813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:30:25.304813.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T16_30_25.304813", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:30:25.304813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:30:25.304813.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T16_30_25.304813", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:30:25.304813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:30:25.304813.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T16_30_25.304813", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:30:25.304813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:30:25.304813.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T16_30_25.304813", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:30:25.304813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:30:25.304813.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T16_30_25.304813", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:30:25.304813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:30:25.304813.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T16_30_25.304813", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:30:25.304813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:30:25.304813.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T16_30_25.304813", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:30:25.304813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:30:25.304813.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T16_30_25.304813", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T16:30:25.304813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T16:30:25.304813.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T16_30_25.304813", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:30:25.304813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:30:25.304813.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T16_30_25.304813", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:30:25.304813.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:30:25.304813.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T16_30_25.304813", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:30:25.304813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:30:25.304813.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T16_30_25.304813", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:30:25.304813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:30:25.304813.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T16_30_25.304813", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:30:25.304813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:30:25.304813.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T16_30_25.304813", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:30:25.304813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:30:25.304813.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T16_30_25.304813", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:30:25.304813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:30:25.304813.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T16_30_25.304813", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:30:25.304813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:30:25.304813.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T16_30_25.304813", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:30:25.304813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:30:25.304813.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T16_30_25.304813", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:30:25.304813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:30:25.304813.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T16_30_25.304813", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:30:25.304813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:30:25.304813.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T16_30_25.304813", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:30:25.304813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:30:25.304813.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T16_30_25.304813", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:30:25.304813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:30:25.304813.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T16_30_25.304813", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:30:25.304813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:30:25.304813.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T16_30_25.304813", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:30:25.304813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:30:25.304813.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T16_30_25.304813", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:30:25.304813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:30:25.304813.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T16_30_25.304813", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T16:30:25.304813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T16:30:25.304813.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T16_30_25.304813", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:30:25.304813.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:30:25.304813.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T16_30_25.304813", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T16:30:25.304813.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T16:30:25.304813.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_13T02_37_55.174881", "path": ["**/details_harness|winogrande|5_2023-10-13T02-37-55.174881.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-13T02-37-55.174881.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T16_30_25.304813", "path": ["results_2023-07-19T16:30:25.304813.parquet"]}, {"split": "2023_10_13T02_37_55.174881", "path": ["results_2023-10-13T02-37-55.174881.parquet"]}, {"split": "latest", "path": ["results_2023-10-13T02-37-55.174881.parquet"]}]}]}
|
2023-10-13T01:38:07+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of medalpaca/medalpaca-7b
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model medalpaca/medalpaca-7b on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-10-13T02:37:55.174881 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of medalpaca/medalpaca-7b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model medalpaca/medalpaca-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-13T02:37:55.174881(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of medalpaca/medalpaca-7b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model medalpaca/medalpaca-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-13T02:37:55.174881(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
18,
31,
166,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of medalpaca/medalpaca-7b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model medalpaca/medalpaca-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-13T02:37:55.174881(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
c360e03d37c0c1d91204e69fe23a37314a6714bf
|
# Dataset Card for Evaluation run of SebastianSchramm/Cerebras-GPT-111M-instruction
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/SebastianSchramm/Cerebras-GPT-111M-instruction
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [SebastianSchramm/Cerebras-GPT-111M-instruction](https://huggingface.co/SebastianSchramm/Cerebras-GPT-111M-instruction) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_SebastianSchramm__Cerebras-GPT-111M-instruction",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-24T16:31:53.265956](https://huggingface.co/datasets/open-llm-leaderboard/details_SebastianSchramm__Cerebras-GPT-111M-instruction/blob/main/results_2023-10-24T16-31-53.265956.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.00010486577181208053,
"em_stderr": 0.00010486577181208799,
"f1": 0.0016642197986577185,
"f1_stderr": 0.00029156266897188764,
"acc": 0.2580899763220205,
"acc_stderr": 0.007022563065489298
},
"harness|drop|3": {
"em": 0.00010486577181208053,
"em_stderr": 0.00010486577181208799,
"f1": 0.0016642197986577185,
"f1_stderr": 0.00029156266897188764
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.516179952644041,
"acc_stderr": 0.014045126130978596
}
}
```
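Beyond the "latest" split, each configuration also keeps one split per run, named after the run's timestamp. A minimal sketch loading the DROP details for the run shown above; the config name and split are taken from this card's metadata, while the column layout is not documented here:

```python
from datasets import load_dataset

# Timestamp-named splits pin a configuration to one specific evaluation run.
drop_details = load_dataset(
    "open-llm-leaderboard/details_SebastianSchramm__Cerebras-GPT-111M-instruction",
    "harness_drop_3",
    split="2023_10_24T16_31_53.265956",
)

# Inspect the per-example records rather than assuming a schema.
print(len(drop_details), drop_details.column_names)
```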
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_SebastianSchramm__Cerebras-GPT-111M-instruction
|
[
"region:us"
] |
2023-08-18T10:18:52+00:00
|
{"pretty_name": "Evaluation run of SebastianSchramm/Cerebras-GPT-111M-instruction", "dataset_summary": "Dataset automatically created during the evaluation run of model [SebastianSchramm/Cerebras-GPT-111M-instruction](https://huggingface.co/SebastianSchramm/Cerebras-GPT-111M-instruction) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_SebastianSchramm__Cerebras-GPT-111M-instruction\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-24T16:31:53.265956](https://huggingface.co/datasets/open-llm-leaderboard/details_SebastianSchramm__Cerebras-GPT-111M-instruction/blob/main/results_2023-10-24T16-31-53.265956.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.00010486577181208053,\n \"em_stderr\": 0.00010486577181208799,\n \"f1\": 0.0016642197986577185,\n \"f1_stderr\": 0.00029156266897188764,\n \"acc\": 0.2580899763220205,\n \"acc_stderr\": 0.007022563065489298\n },\n \"harness|drop|3\": {\n \"em\": 0.00010486577181208053,\n \"em_stderr\": 0.00010486577181208799,\n \"f1\": 0.0016642197986577185,\n \"f1_stderr\": 0.00029156266897188764\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.516179952644041,\n \"acc_stderr\": 0.014045126130978596\n }\n}\n```", "repo_url": "https://huggingface.co/SebastianSchramm/Cerebras-GPT-111M-instruction", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T13_50_00.639660", "path": ["**/details_harness|arc:challenge|25_2023-07-19T13:50:00.639660.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T13:50:00.639660.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_24T16_31_53.265956", "path": ["**/details_harness|drop|3_2023-10-24T16-31-53.265956.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-24T16-31-53.265956.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_24T16_31_53.265956", "path": ["**/details_harness|gsm8k|5_2023-10-24T16-31-53.265956.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-24T16-31-53.265956.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T13_50_00.639660", "path": ["**/details_harness|hellaswag|10_2023-07-19T13:50:00.639660.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hellaswag|10_2023-07-19T13:50:00.639660.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T13_50_00.639660", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:50:00.639660.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:50:00.639660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:50:00.639660.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:50:00.639660.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T13:50:00.639660.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:50:00.639660.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_19T13_50_00.639660", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:50:00.639660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:50:00.639660.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T13_50_00.639660", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:50:00.639660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:50:00.639660.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T13_50_00.639660", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:50:00.639660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:50:00.639660.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T13_50_00.639660", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:50:00.639660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:50:00.639660.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T13_50_00.639660", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:50:00.639660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:50:00.639660.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T13_50_00.639660", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:50:00.639660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:50:00.639660.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T13_50_00.639660", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:50:00.639660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:50:00.639660.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T13_50_00.639660", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:50:00.639660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:50:00.639660.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T13_50_00.639660", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:50:00.639660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:50:00.639660.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T13_50_00.639660", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:50:00.639660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:50:00.639660.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_07_19T13_50_00.639660", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:50:00.639660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:50:00.639660.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T13_50_00.639660", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:50:00.639660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:50:00.639660.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T13_50_00.639660", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:50:00.639660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:50:00.639660.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T13_50_00.639660", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:50:00.639660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:50:00.639660.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T13_50_00.639660", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:50:00.639660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:50:00.639660.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T13_50_00.639660", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:50:00.639660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:50:00.639660.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T13_50_00.639660", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:50:00.639660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:50:00.639660.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T13_50_00.639660", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:50:00.639660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:50:00.639660.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T13_50_00.639660", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:50:00.639660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:50:00.639660.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T13_50_00.639660", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:50:00.639660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:50:00.639660.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T13_50_00.639660", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:50:00.639660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:50:00.639660.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T13_50_00.639660", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:50:00.639660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:50:00.639660.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T13_50_00.639660", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:50:00.639660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:50:00.639660.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T13_50_00.639660", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:50:00.639660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:50:00.639660.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T13_50_00.639660", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:50:00.639660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:50:00.639660.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T13_50_00.639660", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:50:00.639660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:50:00.639660.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T13_50_00.639660", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:50:00.639660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:50:00.639660.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T13_50_00.639660", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:50:00.639660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:50:00.639660.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T13_50_00.639660", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:50:00.639660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:50:00.639660.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T13_50_00.639660", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:50:00.639660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:50:00.639660.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T13_50_00.639660", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:50:00.639660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:50:00.639660.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T13_50_00.639660", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:50:00.639660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:50:00.639660.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T13_50_00.639660", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:50:00.639660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:50:00.639660.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T13_50_00.639660", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:50:00.639660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:50:00.639660.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T13_50_00.639660", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:50:00.639660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:50:00.639660.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T13_50_00.639660", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:50:00.639660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:50:00.639660.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T13_50_00.639660", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:50:00.639660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:50:00.639660.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T13_50_00.639660", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:50:00.639660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:50:00.639660.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T13_50_00.639660", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T13:50:00.639660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T13:50:00.639660.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T13_50_00.639660", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:50:00.639660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:50:00.639660.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T13_50_00.639660", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:50:00.639660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:50:00.639660.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T13_50_00.639660", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:50:00.639660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:50:00.639660.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T13_50_00.639660", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:50:00.639660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:50:00.639660.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T13_50_00.639660", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:50:00.639660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:50:00.639660.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T13_50_00.639660", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:50:00.639660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:50:00.639660.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T13_50_00.639660", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:50:00.639660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:50:00.639660.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T13_50_00.639660", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:50:00.639660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:50:00.639660.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T13_50_00.639660", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:50:00.639660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:50:00.639660.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T13_50_00.639660", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:50:00.639660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:50:00.639660.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T13_50_00.639660", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:50:00.639660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:50:00.639660.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T13_50_00.639660", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:50:00.639660.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:50:00.639660.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_19T13_50_00.639660", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:50:00.639660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:50:00.639660.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T13_50_00.639660", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:50:00.639660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:50:00.639660.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T13_50_00.639660", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:50:00.639660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:50:00.639660.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T13_50_00.639660", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:50:00.639660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:50:00.639660.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T13_50_00.639660", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T13:50:00.639660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T13:50:00.639660.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T13_50_00.639660", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:50:00.639660.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:50:00.639660.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T13_50_00.639660", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T13:50:00.639660.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T13:50:00.639660.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_24T16_31_53.265956", "path": ["**/details_harness|winogrande|5_2023-10-24T16-31-53.265956.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-24T16-31-53.265956.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T13_50_00.639660", "path": ["results_2023-07-19T13:50:00.639660.parquet"]}, {"split": "2023_10_24T16_31_53.265956", "path": ["results_2023-10-24T16-31-53.265956.parquet"]}, {"split": "latest", "path": ["results_2023-10-24T16-31-53.265956.parquet"]}]}]}
|
2023-10-24T15:32:05+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of SebastianSchramm/Cerebras-GPT-111M-instruction
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model SebastianSchramm/Cerebras-GPT-111M-instruction on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
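A minimal sketch, assuming the `open-llm-leaderboard/details_<org>__<model>` repository naming and the `harness_winogrande_5` configuration used by the other evaluation-run cards in this document:

```python
from datasets import load_dataset

# Repository name assumed from the details_<org>__<model> convention
# used by the other evaluation-run datasets documented here.
data = load_dataset("open-llm-leaderboard/details_SebastianSchramm__Cerebras-GPT-111M-instruction",
	"harness_winogrande_5",
	split="train")
```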
## Latest results
These are the latest results from run 2023-10-24T16:31:53.265956 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of SebastianSchramm/Cerebras-GPT-111M-instruction",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model SebastianSchramm/Cerebras-GPT-111M-instruction on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-24T16:31:53.265956(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of SebastianSchramm/Cerebras-GPT-111M-instruction",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model SebastianSchramm/Cerebras-GPT-111M-instruction on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-24T16:31:53.265956(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
25,
31,
173,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of SebastianSchramm/Cerebras-GPT-111M-instruction## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model SebastianSchramm/Cerebras-GPT-111M-instruction on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-24T16:31:53.265956(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
52ab935cd8137fb0a3f529dca255bf3ca7b05641
|
# Dataset Card for Evaluation run of julianweng/Llama-2-7b-chat-orcah
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/julianweng/Llama-2-7b-chat-orcah
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [julianweng/Llama-2-7b-chat-orcah](https://huggingface.co/julianweng/Llama-2-7b-chat-orcah) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_julianweng__Llama-2-7b-chat-orcah",
"harness_winogrande_5",
split="train")
```
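The same pattern works for any configuration declared in this card's metadata; a short sketch, assuming only the standard `datasets` helpers, that lists the available configs and pulls the aggregated "results" split referenced below:

```python
from datasets import get_dataset_config_names, load_dataset

REPO = "open-llm-leaderboard/details_julianweng__Llama-2-7b-chat-orcah"

# One config per evaluated task, plus the aggregate "results" config.
configs = get_dataset_config_names(REPO)

# "latest" is a split alias declared alongside the timestamped splits,
# so this always returns the newest run's aggregated metrics.
results = load_dataset(REPO, "results", split="latest")
```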
## Latest results
These are the [latest results from run 2023-09-17T17:33:03.536328](https://huggingface.co/datasets/open-llm-leaderboard/details_julianweng__Llama-2-7b-chat-orcah/blob/main/results_2023-09-17T17-33-03.536328.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.02936241610738255,
"em_stderr": 0.0017288770032803159,
"f1": 0.07552432885906037,
"f1_stderr": 0.0020587215501161925,
"acc": 0.3737288120380116,
"acc_stderr": 0.00900957367793152
},
"harness|drop|3": {
"em": 0.02936241610738255,
"em_stderr": 0.0017288770032803159,
"f1": 0.07552432885906037,
"f1_stderr": 0.0020587215501161925
},
"harness|gsm8k|5": {
"acc": 0.03790750568612585,
"acc_stderr": 0.005260333907798431
},
"harness|winogrande|5": {
"acc": 0.7095501183898973,
"acc_stderr": 0.01275881344806461
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_julianweng__Llama-2-7b-chat-orcah
|
[
"region:us"
] |
2023-08-18T10:19:00+00:00
|
{"pretty_name": "Evaluation run of julianweng/Llama-2-7b-chat-orcah", "dataset_summary": "Dataset automatically created during the evaluation run of model [julianweng/Llama-2-7b-chat-orcah](https://huggingface.co/julianweng/Llama-2-7b-chat-orcah) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_julianweng__Llama-2-7b-chat-orcah\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-17T17:33:03.536328](https://huggingface.co/datasets/open-llm-leaderboard/details_julianweng__Llama-2-7b-chat-orcah/blob/main/results_2023-09-17T17-33-03.536328.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.02936241610738255,\n \"em_stderr\": 0.0017288770032803159,\n \"f1\": 0.07552432885906037,\n \"f1_stderr\": 0.0020587215501161925,\n \"acc\": 0.3737288120380116,\n \"acc_stderr\": 0.00900957367793152\n },\n \"harness|drop|3\": {\n \"em\": 0.02936241610738255,\n \"em_stderr\": 0.0017288770032803159,\n \"f1\": 0.07552432885906037,\n \"f1_stderr\": 0.0020587215501161925\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.03790750568612585,\n \"acc_stderr\": 0.005260333907798431\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7095501183898973,\n \"acc_stderr\": 0.01275881344806461\n }\n}\n```", "repo_url": "https://huggingface.co/julianweng/Llama-2-7b-chat-orcah", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_24T11_44_40.236710", "path": ["**/details_harness|arc:challenge|25_2023-07-24T11:44:40.236710.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-24T11:44:40.236710.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_17T17_33_03.536328", "path": ["**/details_harness|drop|3_2023-09-17T17-33-03.536328.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-17T17-33-03.536328.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_17T17_33_03.536328", "path": ["**/details_harness|gsm8k|5_2023-09-17T17-33-03.536328.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-17T17-33-03.536328.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_24T11_44_40.236710", "path": ["**/details_harness|hellaswag|10_2023-07-24T11:44:40.236710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-24T11:44:40.236710.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_5", "data_files": [{"split": "2023_07_24T11_44_40.236710", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T11:44:40.236710.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-24T11:44:40.236710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T11:44:40.236710.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T11:44:40.236710.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-24T11:44:40.236710.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_24T11_44_40.236710", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T11:44:40.236710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T11:44:40.236710.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_24T11_44_40.236710", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T11:44:40.236710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T11:44:40.236710.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_24T11_44_40.236710", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T11:44:40.236710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T11:44:40.236710.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_24T11_44_40.236710", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T11:44:40.236710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T11:44:40.236710.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_24T11_44_40.236710", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T11:44:40.236710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T11:44:40.236710.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_24T11_44_40.236710", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T11:44:40.236710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T11:44:40.236710.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_24T11_44_40.236710", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T11:44:40.236710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T11:44:40.236710.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_24T11_44_40.236710", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T11:44:40.236710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T11:44:40.236710.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_24T11_44_40.236710", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T11:44:40.236710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T11:44:40.236710.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_24T11_44_40.236710", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T11:44:40.236710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T11:44:40.236710.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_24T11_44_40.236710", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T11:44:40.236710.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T11:44:40.236710.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_24T11_44_40.236710", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T11:44:40.236710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T11:44:40.236710.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_24T11_44_40.236710", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T11:44:40.236710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T11:44:40.236710.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_24T11_44_40.236710", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T11:44:40.236710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T11:44:40.236710.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_24T11_44_40.236710", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T11:44:40.236710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T11:44:40.236710.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_24T11_44_40.236710", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T11:44:40.236710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T11:44:40.236710.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_24T11_44_40.236710", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T11:44:40.236710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T11:44:40.236710.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_24T11_44_40.236710", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T11:44:40.236710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T11:44:40.236710.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_24T11_44_40.236710", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T11:44:40.236710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T11:44:40.236710.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_24T11_44_40.236710", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T11:44:40.236710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T11:44:40.236710.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_24T11_44_40.236710", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T11:44:40.236710.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T11:44:40.236710.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_24T11_44_40.236710", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T11:44:40.236710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T11:44:40.236710.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_24T11_44_40.236710", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T11:44:40.236710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T11:44:40.236710.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_24T11_44_40.236710", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T11:44:40.236710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T11:44:40.236710.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_24T11_44_40.236710", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T11:44:40.236710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T11:44:40.236710.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_24T11_44_40.236710", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T11:44:40.236710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T11:44:40.236710.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_24T11_44_40.236710", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T11:44:40.236710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T11:44:40.236710.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_24T11_44_40.236710", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T11:44:40.236710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T11:44:40.236710.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_24T11_44_40.236710", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T11:44:40.236710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T11:44:40.236710.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_24T11_44_40.236710", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T11:44:40.236710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T11:44:40.236710.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_24T11_44_40.236710", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T11:44:40.236710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T11:44:40.236710.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_24T11_44_40.236710", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T11:44:40.236710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T11:44:40.236710.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_24T11_44_40.236710", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T11:44:40.236710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T11:44:40.236710.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_24T11_44_40.236710", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T11:44:40.236710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T11:44:40.236710.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_24T11_44_40.236710", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T11:44:40.236710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T11:44:40.236710.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_24T11_44_40.236710", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T11:44:40.236710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T11:44:40.236710.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_24T11_44_40.236710", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T11:44:40.236710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T11:44:40.236710.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_24T11_44_40.236710", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T11:44:40.236710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T11:44:40.236710.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_24T11_44_40.236710", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T11:44:40.236710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T11:44:40.236710.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_24T11_44_40.236710", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T11:44:40.236710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T11:44:40.236710.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_24T11_44_40.236710", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T11:44:40.236710.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T11:44:40.236710.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_24T11_44_40.236710", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T11:44:40.236710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T11:44:40.236710.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_24T11_44_40.236710", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T11:44:40.236710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T11:44:40.236710.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_24T11_44_40.236710", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T11:44:40.236710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T11:44:40.236710.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_24T11_44_40.236710", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T11:44:40.236710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T11:44:40.236710.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_24T11_44_40.236710", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T11:44:40.236710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T11:44:40.236710.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_24T11_44_40.236710", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T11:44:40.236710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T11:44:40.236710.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_24T11_44_40.236710", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T11:44:40.236710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T11:44:40.236710.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_24T11_44_40.236710", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T11:44:40.236710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T11:44:40.236710.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_24T11_44_40.236710", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T11:44:40.236710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T11:44:40.236710.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_24T11_44_40.236710", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T11:44:40.236710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T11:44:40.236710.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_24T11_44_40.236710", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T11:44:40.236710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T11:44:40.236710.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_24T11_44_40.236710", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T11:44:40.236710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T11:44:40.236710.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_24T11_44_40.236710", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T11:44:40.236710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T11:44:40.236710.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_24T11_44_40.236710", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T11:44:40.236710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T11:44:40.236710.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_24T11_44_40.236710", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T11:44:40.236710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T11:44:40.236710.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_24T11_44_40.236710", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T11:44:40.236710.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T11:44:40.236710.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_24T11_44_40.236710", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T11:44:40.236710.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T11:44:40.236710.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_17T17_33_03.536328", "path": ["**/details_harness|winogrande|5_2023-09-17T17-33-03.536328.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-17T17-33-03.536328.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_24T11_44_40.236710", "path": ["results_2023-07-24T11:44:40.236710.parquet"]}, {"split": "2023_09_17T17_33_03.536328", "path": ["results_2023-09-17T17-33-03.536328.parquet"]}, {"split": "latest", "path": ["results_2023-09-17T17-33-03.536328.parquet"]}]}]}
|
2023-09-17T16:33:16+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of julianweng/Llama-2-7b-chat-orcah
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model julianweng/Llama-2-7b-chat-orcah on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
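A minimal sketch (the original snippet was stripped from this copy of the card; the repository id below assumes the leaderboard's standard `details_<org>__<model>` naming):

```python
from datasets import load_dataset

# Per-task details for one configuration; "train" points at the latest run.
data = load_dataset("open-llm-leaderboard/details_julianweng__Llama-2-7b-chat-orcah",
                    "harness_winogrande_5",
                    split="train")
```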
## Latest results
These are the latest results from run 2023-09-17T17:33:03.536328 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
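The aggregated numbers themselves live in the "results" configuration declared in the repository metadata above; a sketch, again assuming the standard repository naming:

```python
from datasets import load_dataset

# "latest" always resolves to the most recent evaluation run.
results = load_dataset("open-llm-leaderboard/details_julianweng__Llama-2-7b-chat-orcah",
                       "results",
                       split="latest")
```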
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
d2a2072fe12925e3f487fd1f55f73a0e5e5d54f5
|
# Dataset Card for Evaluation run of kingbri/airolima-chronos-grad-l2-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/kingbri/airolima-chronos-grad-l2-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [kingbri/airolima-chronos-grad-l2-13B](https://huggingface.co/kingbri/airolima-chronos-grad-l2-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_kingbri__airolima-chronos-grad-l2-13B",
"harness_winogrande_5",
split="train")
```
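The split names come from the run timestamps declared in the repository metadata, and `latest` always resolves to the newest run. Two more patterns that should work given the configurations listed there (e.g. `harness_gsm8k_5` and `results`):

```python
from datasets import load_dataset

# Per-task details, e.g. GSM8K (5-shot), for the newest run.
gsm8k = load_dataset("open-llm-leaderboard/details_kingbri__airolima-chronos-grad-l2-13B",
                     "harness_gsm8k_5",
                     split="latest")

# Aggregated metrics for every run are stored in the "results" configuration.
results = load_dataset("open-llm-leaderboard/details_kingbri__airolima-chronos-grad-l2-13B",
                       "results",
                       split="latest")
```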
## Latest results
These are the [latest results from run 2023-10-15T05:59:59.721440](https://huggingface.co/datasets/open-llm-leaderboard/details_kingbri__airolima-chronos-grad-l2-13B/blob/main/results_2023-10-15T05-59-59.721440.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.013213087248322148,
"em_stderr": 0.0011693741608321197,
"f1": 0.07846791107382547,
"f1_stderr": 0.0017929893502969876,
"acc": 0.44747581489169586,
"acc_stderr": 0.010742362890413708
},
"harness|drop|3": {
"em": 0.013213087248322148,
"em_stderr": 0.0011693741608321197,
"f1": 0.07846791107382547,
"f1_stderr": 0.0017929893502969876
},
"harness|gsm8k|5": {
"acc": 0.13646702047005307,
"acc_stderr": 0.00945574199881554
},
"harness|winogrande|5": {
"acc": 0.7584846093133386,
"acc_stderr": 0.012028983782011875
}
}
```
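As a quick sanity check, the top-level `acc` here appears to be the plain average of the two per-task accuracies reported in the same run:

```python
# Reproduce the aggregate "acc" from the per-task accuracies above.
gsm8k_acc = 0.13646702047005307
winogrande_acc = 0.7584846093133386
print((gsm8k_acc + winogrande_acc) / 2)  # 0.4474758148916958..., matching "acc" up to float rounding
```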
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_kingbri__airolima-chronos-grad-l2-13B
|
[
"region:us"
] |
2023-08-18T10:19:08+00:00
|
{"pretty_name": "Evaluation run of kingbri/airolima-chronos-grad-l2-13B", "dataset_summary": "Dataset automatically created during the evaluation run of model [kingbri/airolima-chronos-grad-l2-13B](https://huggingface.co/kingbri/airolima-chronos-grad-l2-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_kingbri__airolima-chronos-grad-l2-13B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-15T05:59:59.721440](https://huggingface.co/datasets/open-llm-leaderboard/details_kingbri__airolima-chronos-grad-l2-13B/blob/main/results_2023-10-15T05-59-59.721440.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.013213087248322148,\n \"em_stderr\": 0.0011693741608321197,\n \"f1\": 0.07846791107382547,\n \"f1_stderr\": 0.0017929893502969876,\n \"acc\": 0.44747581489169586,\n \"acc_stderr\": 0.010742362890413708\n },\n \"harness|drop|3\": {\n \"em\": 0.013213087248322148,\n \"em_stderr\": 0.0011693741608321197,\n \"f1\": 0.07846791107382547,\n \"f1_stderr\": 0.0017929893502969876\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.13646702047005307,\n \"acc_stderr\": 0.00945574199881554\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7584846093133386,\n \"acc_stderr\": 0.012028983782011875\n }\n}\n```", "repo_url": "https://huggingface.co/kingbri/airolima-chronos-grad-l2-13B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_09T11_49_08.854664", "path": ["**/details_harness|arc:challenge|25_2023-08-09T11:49:08.854664.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-09T11:49:08.854664.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_15T05_59_59.721440", "path": ["**/details_harness|drop|3_2023-10-15T05-59-59.721440.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-15T05-59-59.721440.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_15T05_59_59.721440", "path": ["**/details_harness|gsm8k|5_2023-10-15T05-59-59.721440.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-15T05-59-59.721440.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_09T11_49_08.854664", "path": ["**/details_harness|hellaswag|10_2023-08-09T11:49:08.854664.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hellaswag|10_2023-08-09T11:49:08.854664.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_09T11_49_08.854664", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T11:49:08.854664.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T11:49:08.854664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T11:49:08.854664.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T11:49:08.854664.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T11:49:08.854664.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-08-09T11:49:08.854664.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_09T11_49_08.854664", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T11:49:08.854664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T11:49:08.854664.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_09T11_49_08.854664", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T11:49:08.854664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T11:49:08.854664.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_09T11_49_08.854664", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T11:49:08.854664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T11:49:08.854664.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_09T11_49_08.854664", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T11:49:08.854664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T11:49:08.854664.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_09T11_49_08.854664", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T11:49:08.854664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T11:49:08.854664.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_09T11_49_08.854664", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T11:49:08.854664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T11:49:08.854664.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_09T11_49_08.854664", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T11:49:08.854664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T11:49:08.854664.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_09T11_49_08.854664", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T11:49:08.854664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T11:49:08.854664.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_09T11_49_08.854664", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T11:49:08.854664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T11:49:08.854664.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_09T11_49_08.854664", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T11:49:08.854664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T11:49:08.854664.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_08_09T11_49_08.854664", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T11:49:08.854664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T11:49:08.854664.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_09T11_49_08.854664", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T11:49:08.854664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T11:49:08.854664.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_09T11_49_08.854664", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T11:49:08.854664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T11:49:08.854664.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_09T11_49_08.854664", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T11:49:08.854664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T11:49:08.854664.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_09T11_49_08.854664", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T11:49:08.854664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T11:49:08.854664.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_09T11_49_08.854664", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T11:49:08.854664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T11:49:08.854664.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_09T11_49_08.854664", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T11:49:08.854664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T11:49:08.854664.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_09T11_49_08.854664", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T11:49:08.854664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T11:49:08.854664.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_09T11_49_08.854664", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T11:49:08.854664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T11:49:08.854664.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_09T11_49_08.854664", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T11:49:08.854664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T11:49:08.854664.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_09T11_49_08.854664", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T11:49:08.854664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T11:49:08.854664.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_09T11_49_08.854664", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T11:49:08.854664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T11:49:08.854664.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_09T11_49_08.854664", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T11:49:08.854664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T11:49:08.854664.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_09T11_49_08.854664", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T11:49:08.854664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T11:49:08.854664.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_09T11_49_08.854664", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T11:49:08.854664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T11:49:08.854664.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_09T11_49_08.854664", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T11:49:08.854664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T11:49:08.854664.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_09T11_49_08.854664", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T11:49:08.854664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T11:49:08.854664.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_09T11_49_08.854664", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T11:49:08.854664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T11:49:08.854664.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_09T11_49_08.854664", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T11:49:08.854664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T11:49:08.854664.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_09T11_49_08.854664", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T11:49:08.854664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T11:49:08.854664.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_09T11_49_08.854664", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T11:49:08.854664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T11:49:08.854664.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_09T11_49_08.854664", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T11:49:08.854664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T11:49:08.854664.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_09T11_49_08.854664", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T11:49:08.854664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T11:49:08.854664.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_09T11_49_08.854664", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T11:49:08.854664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T11:49:08.854664.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_09T11_49_08.854664", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T11:49:08.854664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T11:49:08.854664.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_09T11_49_08.854664", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T11:49:08.854664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T11:49:08.854664.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_09T11_49_08.854664", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T11:49:08.854664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T11:49:08.854664.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_09T11_49_08.854664", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T11:49:08.854664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T11:49:08.854664.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_09T11_49_08.854664", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T11:49:08.854664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T11:49:08.854664.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_09T11_49_08.854664", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T11:49:08.854664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T11:49:08.854664.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_09T11_49_08.854664", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T11:49:08.854664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T11:49:08.854664.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_09T11_49_08.854664", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T11:49:08.854664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T11:49:08.854664.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_09T11_49_08.854664", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T11:49:08.854664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T11:49:08.854664.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_09T11_49_08.854664", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T11:49:08.854664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T11:49:08.854664.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_09T11_49_08.854664", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T11:49:08.854664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T11:49:08.854664.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_09T11_49_08.854664", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T11:49:08.854664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T11:49:08.854664.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_09T11_49_08.854664", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T11:49:08.854664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T11:49:08.854664.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_09T11_49_08.854664", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T11:49:08.854664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T11:49:08.854664.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_09T11_49_08.854664", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T11:49:08.854664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T11:49:08.854664.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_09T11_49_08.854664", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T11:49:08.854664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T11:49:08.854664.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_09T11_49_08.854664", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T11:49:08.854664.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T11:49:08.854664.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_09T11_49_08.854664", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T11:49:08.854664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T11:49:08.854664.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_09T11_49_08.854664", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T11:49:08.854664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T11:49:08.854664.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_09T11_49_08.854664", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T11:49:08.854664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T11:49:08.854664.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_09T11_49_08.854664", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T11:49:08.854664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T11:49:08.854664.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_09T11_49_08.854664", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T11:49:08.854664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T11:49:08.854664.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_09T11_49_08.854664", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T11:49:08.854664.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T11:49:08.854664.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_09T11_49_08.854664", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T11:49:08.854664.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T11:49:08.854664.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_15T05_59_59.721440", "path": ["**/details_harness|winogrande|5_2023-10-15T05-59-59.721440.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-15T05-59-59.721440.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_09T11_49_08.854664", "path": ["results_2023-08-09T11:49:08.854664.parquet"]}, {"split": "2023_10_15T05_59_59.721440", "path": ["results_2023-10-15T05-59-59.721440.parquet"]}, {"split": "latest", "path": ["results_2023-10-15T05-59-59.721440.parquet"]}]}]}
|
2023-10-15T05:00:11+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of kingbri/airolima-chronos-grad-l2-13B
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model kingbri/airolima-chronos-grad-l2-13B on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-10-15T05:59:59.721440 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of kingbri/airolima-chronos-grad-l2-13B",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model kingbri/airolima-chronos-grad-l2-13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-15T05:59:59.721440(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of kingbri/airolima-chronos-grad-l2-13B",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model kingbri/airolima-chronos-grad-l2-13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-15T05:59:59.721440(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
25,
31,
173,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of kingbri/airolima-chronos-grad-l2-13B## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model kingbri/airolima-chronos-grad-l2-13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-15T05:59:59.721440(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
78507597bfb6704916a5ce680617c487d6b4e2a2
|
# Dataset Card for Evaluation run of kingbri/chronolima-airo-grad-l2-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/kingbri/chronolima-airo-grad-l2-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [kingbri/chronolima-airo-grad-l2-13B](https://huggingface.co/kingbri/chronolima-airo-grad-l2-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_kingbri__chronolima-airo-grad-l2-13B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-18T00:13:00.101023](https://huggingface.co/datasets/open-llm-leaderboard/details_kingbri__chronolima-airo-grad-l2-13B/blob/main/results_2023-09-18T00-13-00.101023.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.018770973154362415,
"em_stderr": 0.0013898509848031188,
"f1": 0.08510381711409391,
"f1_stderr": 0.001943348962771241,
"acc": 0.4478082161451867,
"acc_stderr": 0.010806174983049747
},
"harness|drop|3": {
"em": 0.018770973154362415,
"em_stderr": 0.0013898509848031188,
"f1": 0.08510381711409391,
"f1_stderr": 0.001943348962771241
},
"harness|gsm8k|5": {
"acc": 0.13949962092494314,
"acc_stderr": 0.0095434266871913
},
"harness|winogrande|5": {
"acc": 0.7561168113654302,
"acc_stderr": 0.012068923278908194
}
}
```
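These aggregated numbers live in the "results" configuration; here is a minimal sketch for loading its latest split, with the config and split names taken from this card's metadata record below:
```python
from datasets import load_dataset

# "results" and "latest" are the config/split names declared in this
# card's metadata, so this returns the aggregated metrics shown above.
results = load_dataset("open-llm-leaderboard/details_kingbri__chronolima-airo-grad-l2-13B",
	"results",
	split="latest")
```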
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_kingbri__chronolima-airo-grad-l2-13B
|
[
"region:us"
] |
2023-08-18T10:19:19+00:00
|
{"pretty_name": "Evaluation run of kingbri/chronolima-airo-grad-l2-13B", "dataset_summary": "Dataset automatically created during the evaluation run of model [kingbri/chronolima-airo-grad-l2-13B](https://huggingface.co/kingbri/chronolima-airo-grad-l2-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_kingbri__chronolima-airo-grad-l2-13B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-18T00:13:00.101023](https://huggingface.co/datasets/open-llm-leaderboard/details_kingbri__chronolima-airo-grad-l2-13B/blob/main/results_2023-09-18T00-13-00.101023.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.018770973154362415,\n \"em_stderr\": 0.0013898509848031188,\n \"f1\": 0.08510381711409391,\n \"f1_stderr\": 0.001943348962771241,\n \"acc\": 0.4478082161451867,\n \"acc_stderr\": 0.010806174983049747\n },\n \"harness|drop|3\": {\n \"em\": 0.018770973154362415,\n \"em_stderr\": 0.0013898509848031188,\n \"f1\": 0.08510381711409391,\n \"f1_stderr\": 0.001943348962771241\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.13949962092494314,\n \"acc_stderr\": 0.0095434266871913\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7561168113654302,\n \"acc_stderr\": 0.012068923278908194\n }\n}\n```", "repo_url": "https://huggingface.co/kingbri/chronolima-airo-grad-l2-13B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_09T11_57_29.540366", "path": ["**/details_harness|arc:challenge|25_2023-08-09T11:57:29.540366.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-09T11:57:29.540366.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_18T00_13_00.101023", "path": ["**/details_harness|drop|3_2023-09-18T00-13-00.101023.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-18T00-13-00.101023.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_18T00_13_00.101023", "path": ["**/details_harness|gsm8k|5_2023-09-18T00-13-00.101023.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-18T00-13-00.101023.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_09T11_57_29.540366", "path": ["**/details_harness|hellaswag|10_2023-08-09T11:57:29.540366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-09T11:57:29.540366.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_09T11_57_29.540366", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T11:57:29.540366.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T11:57:29.540366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T11:57:29.540366.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T11:57:29.540366.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T11:57:29.540366.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_08_09T11_57_29.540366", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T11:57:29.540366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T11:57:29.540366.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_09T11_57_29.540366", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T11:57:29.540366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T11:57:29.540366.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_09T11_57_29.540366", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T11:57:29.540366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T11:57:29.540366.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_09T11_57_29.540366", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T11:57:29.540366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T11:57:29.540366.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_09T11_57_29.540366", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T11:57:29.540366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T11:57:29.540366.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_09T11_57_29.540366", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T11:57:29.540366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T11:57:29.540366.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_09T11_57_29.540366", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T11:57:29.540366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T11:57:29.540366.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_09T11_57_29.540366", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T11:57:29.540366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T11:57:29.540366.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_09T11_57_29.540366", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T11:57:29.540366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T11:57:29.540366.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_09T11_57_29.540366", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T11:57:29.540366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T11:57:29.540366.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_09T11_57_29.540366", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T11:57:29.540366.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T11:57:29.540366.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_09T11_57_29.540366", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T11:57:29.540366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T11:57:29.540366.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_09T11_57_29.540366", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T11:57:29.540366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T11:57:29.540366.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_09T11_57_29.540366", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T11:57:29.540366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T11:57:29.540366.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_09T11_57_29.540366", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T11:57:29.540366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T11:57:29.540366.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_09T11_57_29.540366", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T11:57:29.540366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T11:57:29.540366.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_09T11_57_29.540366", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T11:57:29.540366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T11:57:29.540366.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_09T11_57_29.540366", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T11:57:29.540366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T11:57:29.540366.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_09T11_57_29.540366", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T11:57:29.540366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T11:57:29.540366.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_09T11_57_29.540366", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T11:57:29.540366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T11:57:29.540366.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_09T11_57_29.540366", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T11:57:29.540366.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T11:57:29.540366.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_09T11_57_29.540366", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T11:57:29.540366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T11:57:29.540366.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_09T11_57_29.540366", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T11:57:29.540366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T11:57:29.540366.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_09T11_57_29.540366", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T11:57:29.540366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T11:57:29.540366.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_09T11_57_29.540366", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T11:57:29.540366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T11:57:29.540366.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_09T11_57_29.540366", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T11:57:29.540366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T11:57:29.540366.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_09T11_57_29.540366", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T11:57:29.540366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T11:57:29.540366.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_09T11_57_29.540366", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T11:57:29.540366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T11:57:29.540366.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_09T11_57_29.540366", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T11:57:29.540366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T11:57:29.540366.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_09T11_57_29.540366", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T11:57:29.540366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T11:57:29.540366.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_09T11_57_29.540366", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T11:57:29.540366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T11:57:29.540366.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_09T11_57_29.540366", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T11:57:29.540366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T11:57:29.540366.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_09T11_57_29.540366", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T11:57:29.540366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T11:57:29.540366.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_09T11_57_29.540366", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T11:57:29.540366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T11:57:29.540366.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_09T11_57_29.540366", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T11:57:29.540366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T11:57:29.540366.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_09T11_57_29.540366", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T11:57:29.540366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T11:57:29.540366.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_09T11_57_29.540366", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T11:57:29.540366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T11:57:29.540366.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_09T11_57_29.540366", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T11:57:29.540366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T11:57:29.540366.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_09T11_57_29.540366", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T11:57:29.540366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T11:57:29.540366.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_09T11_57_29.540366", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T11:57:29.540366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T11:57:29.540366.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_09T11_57_29.540366", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T11:57:29.540366.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T11:57:29.540366.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_09T11_57_29.540366", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T11:57:29.540366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T11:57:29.540366.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_09T11_57_29.540366", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T11:57:29.540366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T11:57:29.540366.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_09T11_57_29.540366", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T11:57:29.540366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T11:57:29.540366.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_09T11_57_29.540366", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T11:57:29.540366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T11:57:29.540366.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_09T11_57_29.540366", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T11:57:29.540366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T11:57:29.540366.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_09T11_57_29.540366", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T11:57:29.540366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T11:57:29.540366.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_09T11_57_29.540366", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T11:57:29.540366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T11:57:29.540366.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_09T11_57_29.540366", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T11:57:29.540366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T11:57:29.540366.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_09T11_57_29.540366", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T11:57:29.540366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T11:57:29.540366.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_09T11_57_29.540366", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T11:57:29.540366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T11:57:29.540366.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_08_09T11_57_29.540366", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T11:57:29.540366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T11:57:29.540366.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_09T11_57_29.540366", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T11:57:29.540366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T11:57:29.540366.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_09T11_57_29.540366", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T11:57:29.540366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T11:57:29.540366.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_09T11_57_29.540366", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T11:57:29.540366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T11:57:29.540366.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_09T11_57_29.540366", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T11:57:29.540366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T11:57:29.540366.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_09T11_57_29.540366", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T11:57:29.540366.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T11:57:29.540366.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_09T11_57_29.540366", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T11:57:29.540366.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T11:57:29.540366.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_18T00_13_00.101023", "path": ["**/details_harness|winogrande|5_2023-09-18T00-13-00.101023.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-18T00-13-00.101023.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_09T11_57_29.540366", "path": ["results_2023-08-09T11:57:29.540366.parquet"]}, {"split": "2023_09_18T00_13_00.101023", "path": ["results_2023-09-18T00-13-00.101023.parquet"]}, {"split": "latest", "path": ["results_2023-09-18T00-13-00.101023.parquet"]}]}]}
|
2023-09-17T23:13:11+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of kingbri/chronolima-airo-grad-l2-13B
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model kingbri/chronolima-airo-grad-l2-13B on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
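The code block was stripped from this copy of the card; a minimal sketch follows, with the repository id taken from this card's record below:
```python
from datasets import load_dataset

# Repository id as listed in this card's record
# (open-llm-leaderboard/details_kingbri__chronolima-airo-grad-l2-13B).
data = load_dataset("open-llm-leaderboard/details_kingbri__chronolima-airo-grad-l2-13B",
	"harness_winogrande_5",
	split="train")
```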
## Latest results
These are the latest results from run 2023-09-18T00:13:00.101023 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of kingbri/chronolima-airo-grad-l2-13B",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model kingbri/chronolima-airo-grad-l2-13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-18T00:13:00.101023(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of kingbri/chronolima-airo-grad-l2-13B",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model kingbri/chronolima-airo-grad-l2-13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-18T00:13:00.101023(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
25,
31,
173,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of kingbri/chronolima-airo-grad-l2-13B## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model kingbri/chronolima-airo-grad-l2-13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-18T00:13:00.101023(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
e96a3362fce28e1f72c31ab47b1be5474668d3a6
|
# Dataset Card for Evaluation run of junelee/wizard-vicuna-13b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/junelee/wizard-vicuna-13b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [junelee/wizard-vicuna-13b](https://huggingface.co/junelee/wizard-vicuna-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_junelee__wizard-vicuna-13b",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-29T14:38:37.071272](https://huggingface.co/datasets/open-llm-leaderboard/details_junelee__wizard-vicuna-13b/blob/main/results_2023-10-29T14-38-37.071272.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.03544463087248322,
"em_stderr": 0.0018935573437954085,
"f1": 0.10134962248322173,
"f1_stderr": 0.002317193152000018,
"acc": 0.4199801576497466,
"acc_stderr": 0.010074715624299561
},
"harness|drop|3": {
"em": 0.03544463087248322,
"em_stderr": 0.0018935573437954085,
"f1": 0.10134962248322173,
"f1_stderr": 0.002317193152000018
},
"harness|gsm8k|5": {
"acc": 0.09173616376042457,
"acc_stderr": 0.007950942148339338
},
"harness|winogrande|5": {
"acc": 0.7482241515390686,
"acc_stderr": 0.012198489100259785
}
}
```
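Each evaluated task lives in its own configuration; a minimal sketch for enumerating them with the `datasets` helper (the card states there are 64, which this lets you verify):
```python
from datasets import get_dataset_config_names

# Lists the per-task configs (harness_arc_challenge_25, harness_drop_3, ...)
# declared in this card's metadata, plus the aggregated "results" config.
configs = get_dataset_config_names("open-llm-leaderboard/details_junelee__wizard-vicuna-13b")
print(len(configs), configs[:5])
```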
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_junelee__wizard-vicuna-13b
|
[
"region:us"
] |
2023-08-18T10:19:27+00:00
|
{"pretty_name": "Evaluation run of junelee/wizard-vicuna-13b", "dataset_summary": "Dataset automatically created during the evaluation run of model [junelee/wizard-vicuna-13b](https://huggingface.co/junelee/wizard-vicuna-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_junelee__wizard-vicuna-13b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-29T14:38:37.071272](https://huggingface.co/datasets/open-llm-leaderboard/details_junelee__wizard-vicuna-13b/blob/main/results_2023-10-29T14-38-37.071272.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.03544463087248322,\n \"em_stderr\": 0.0018935573437954085,\n \"f1\": 0.10134962248322173,\n \"f1_stderr\": 0.002317193152000018,\n \"acc\": 0.4199801576497466,\n \"acc_stderr\": 0.010074715624299561\n },\n \"harness|drop|3\": {\n \"em\": 0.03544463087248322,\n \"em_stderr\": 0.0018935573437954085,\n \"f1\": 0.10134962248322173,\n \"f1_stderr\": 0.002317193152000018\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.09173616376042457,\n \"acc_stderr\": 0.007950942148339338\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7482241515390686,\n \"acc_stderr\": 0.012198489100259785\n }\n}\n```", "repo_url": "https://huggingface.co/junelee/wizard-vicuna-13b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_18T14_12_52.847326", "path": ["**/details_harness|arc:challenge|25_2023-07-18T14:12:52.847326.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-18T14:12:52.847326.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_29T14_38_37.071272", "path": ["**/details_harness|drop|3_2023-10-29T14-38-37.071272.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-29T14-38-37.071272.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_29T14_38_37.071272", "path": ["**/details_harness|gsm8k|5_2023-10-29T14-38-37.071272.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-29T14-38-37.071272.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_18T14_12_52.847326", "path": ["**/details_harness|hellaswag|10_2023-07-18T14:12:52.847326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-18T14:12:52.847326.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": 
[{"split": "2023_07_18T14_12_52.847326", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T14:12:52.847326.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-18T14:12:52.847326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T14:12:52.847326.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-18T14:12:52.847326.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-18T14:12:52.847326.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_18T14_12_52.847326", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T14:12:52.847326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T14:12:52.847326.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_18T14_12_52.847326", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-18T14:12:52.847326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-18T14:12:52.847326.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_18T14_12_52.847326", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-18T14:12:52.847326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-18T14:12:52.847326.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_18T14_12_52.847326", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T14:12:52.847326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T14:12:52.847326.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_18T14_12_52.847326", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T14:12:52.847326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T14:12:52.847326.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_18T14_12_52.847326", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-18T14:12:52.847326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-18T14:12:52.847326.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_18T14_12_52.847326", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T14:12:52.847326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T14:12:52.847326.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_18T14_12_52.847326", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T14:12:52.847326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T14:12:52.847326.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_18T14_12_52.847326", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T14:12:52.847326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T14:12:52.847326.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_18T14_12_52.847326", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T14:12:52.847326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T14:12:52.847326.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_18T14_12_52.847326", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-18T14:12:52.847326.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-18T14:12:52.847326.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_18T14_12_52.847326", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-18T14:12:52.847326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-18T14:12:52.847326.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_18T14_12_52.847326", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T14:12:52.847326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T14:12:52.847326.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_18T14_12_52.847326", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-18T14:12:52.847326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-18T14:12:52.847326.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_18T14_12_52.847326", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T14:12:52.847326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T14:12:52.847326.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_18T14_12_52.847326", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T14:12:52.847326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T14:12:52.847326.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_18T14_12_52.847326", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T14:12:52.847326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T14:12:52.847326.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_18T14_12_52.847326", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-18T14:12:52.847326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-18T14:12:52.847326.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_18T14_12_52.847326", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T14:12:52.847326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T14:12:52.847326.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_18T14_12_52.847326", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T14:12:52.847326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T14:12:52.847326.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_18T14_12_52.847326", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T14:12:52.847326.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T14:12:52.847326.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_18T14_12_52.847326", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T14:12:52.847326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T14:12:52.847326.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_18T14_12_52.847326", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T14:12:52.847326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T14:12:52.847326.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_18T14_12_52.847326", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T14:12:52.847326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T14:12:52.847326.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_18T14_12_52.847326", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T14:12:52.847326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T14:12:52.847326.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_18T14_12_52.847326", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T14:12:52.847326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T14:12:52.847326.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_18T14_12_52.847326", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T14:12:52.847326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T14:12:52.847326.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_18T14_12_52.847326", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T14:12:52.847326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T14:12:52.847326.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_18T14_12_52.847326", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T14:12:52.847326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T14:12:52.847326.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_18T14_12_52.847326", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T14:12:52.847326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T14:12:52.847326.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_18T14_12_52.847326", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T14:12:52.847326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T14:12:52.847326.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_18T14_12_52.847326", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T14:12:52.847326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T14:12:52.847326.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_18T14_12_52.847326", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-18T14:12:52.847326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-18T14:12:52.847326.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_18T14_12_52.847326", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T14:12:52.847326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T14:12:52.847326.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_18T14_12_52.847326", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-18T14:12:52.847326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-18T14:12:52.847326.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_18T14_12_52.847326", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T14:12:52.847326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T14:12:52.847326.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_18T14_12_52.847326", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T14:12:52.847326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T14:12:52.847326.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_18T14_12_52.847326", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T14:12:52.847326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T14:12:52.847326.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_18T14_12_52.847326", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-18T14:12:52.847326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-18T14:12:52.847326.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_18T14_12_52.847326", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-18T14:12:52.847326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-18T14:12:52.847326.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_18T14_12_52.847326", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T14:12:52.847326.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T14:12:52.847326.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_18T14_12_52.847326", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T14:12:52.847326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T14:12:52.847326.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_18T14_12_52.847326", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T14:12:52.847326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T14:12:52.847326.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_18T14_12_52.847326", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T14:12:52.847326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T14:12:52.847326.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_18T14_12_52.847326", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-18T14:12:52.847326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-18T14:12:52.847326.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_18T14_12_52.847326", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-18T14:12:52.847326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-18T14:12:52.847326.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_18T14_12_52.847326", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-18T14:12:52.847326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-18T14:12:52.847326.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_18T14_12_52.847326", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T14:12:52.847326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T14:12:52.847326.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_18T14_12_52.847326", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-18T14:12:52.847326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-18T14:12:52.847326.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_18T14_12_52.847326", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T14:12:52.847326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T14:12:52.847326.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_18T14_12_52.847326", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T14:12:52.847326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T14:12:52.847326.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_18T14_12_52.847326", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-18T14:12:52.847326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-18T14:12:52.847326.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_18T14_12_52.847326", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-18T14:12:52.847326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-18T14:12:52.847326.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_18T14_12_52.847326", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-18T14:12:52.847326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-18T14:12:52.847326.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_18T14_12_52.847326", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T14:12:52.847326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T14:12:52.847326.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_18T14_12_52.847326", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-18T14:12:52.847326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-18T14:12:52.847326.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_18T14_12_52.847326", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-18T14:12:52.847326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-18T14:12:52.847326.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_18T14_12_52.847326", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-18T14:12:52.847326.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-18T14:12:52.847326.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_29T14_38_37.071272", "path": ["**/details_harness|winogrande|5_2023-10-29T14-38-37.071272.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-29T14-38-37.071272.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_18T14_12_52.847326", "path": ["results_2023-07-18T14:12:52.847326.parquet"]}, {"split": "2023_10_29T14_38_37.071272", "path": ["results_2023-10-29T14-38-37.071272.parquet"]}, {"split": "latest", "path": ["results_2023-10-29T14-38-37.071272.parquet"]}]}]}
|
2023-10-29T14:38:49+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of junelee/wizard-vicuna-13b
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model junelee/wizard-vicuna-13b on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
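The code block that originally accompanied this sentence is stripped in this processed copy of the card; the snippet below is reproduced from the "dataset_summary" field of the metadata record above, so the dataset name, config, and split are taken from the source rather than assumed:

```python
from datasets import load_dataset

# Load the per-sample details for the 5-shot Winogrande harness run;
# the "train" split points at the latest results.
data = load_dataset("open-llm-leaderboard/details_junelee__wizard-vicuna-13b",
	"harness_winogrande_5",
	split="train")
```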
## Latest results
These are the latest results from run 2023-10-29T14:38:37.071272 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
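For reference, the aggregated numbers for that run, as recorded verbatim in the metadata block above (the results block is also stripped from this processed copy):

```python
{
    "all": {
        "em": 0.03544463087248322,
        "em_stderr": 0.0018935573437954085,
        "f1": 0.10134962248322173,
        "f1_stderr": 0.002317193152000018,
        "acc": 0.4199801576497466,
        "acc_stderr": 0.010074715624299561
    },
    "harness|drop|3": {
        "em": 0.03544463087248322,
        "em_stderr": 0.0018935573437954085,
        "f1": 0.10134962248322173,
        "f1_stderr": 0.002317193152000018
    },
    "harness|gsm8k|5": {
        "acc": 0.09173616376042457,
        "acc_stderr": 0.007950942148339338
    },
    "harness|winogrande|5": {
        "acc": 0.7482241515390686,
        "acc_stderr": 0.012198489100259785
    }
}
```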
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of junelee/wizard-vicuna-13b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model junelee/wizard-vicuna-13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-29T14:38:37.071272(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of junelee/wizard-vicuna-13b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model junelee/wizard-vicuna-13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-29T14:38:37.071272(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
20,
31,
168,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of junelee/wizard-vicuna-13b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model junelee/wizard-vicuna-13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-29T14:38:37.071272(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
c7b9b15df6841376972c444db85c4e13407efb7a
|
# Dataset Card for Evaluation run of jondurbin/airoboros-7b-gpt4-1.3
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/jondurbin/airoboros-7b-gpt4-1.3
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [jondurbin/airoboros-7b-gpt4-1.3](https://huggingface.co/jondurbin/airoboros-7b-gpt4-1.3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-7b-gpt4-1.3",
"harness_winogrande_5",
split="train")
```
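The same loading pattern should also apply to the aggregated "results" configuration. This exact call is not shown in the card, but the config name "results" and its "latest" split are both listed in the configs metadata below, so the following is a minimal sketch under that assumption:

```python
from datasets import load_dataset

# Aggregated metrics across all runs; "latest" points at the most
# recent results file (see the "results" entry in the configs metadata).
results = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-7b-gpt4-1.3",
	"results",
	split="latest")
```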
## Latest results
These are the [latest results from run 2023-10-21T21:38:34.169396](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-7b-gpt4-1.3/blob/main/results_2023-10-21T21-38-34.169396.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.09416946308724833,
"em_stderr": 0.002991012149237377,
"f1": 0.14642302852348915,
"f1_stderr": 0.00310583043009117,
"acc": 0.3664117957865523,
"acc_stderr": 0.00764948379619204
},
"harness|drop|3": {
"em": 0.09416946308724833,
"em_stderr": 0.002991012149237377,
"f1": 0.14642302852348915,
"f1_stderr": 0.00310583043009117
},
"harness|gsm8k|5": {
"acc": 0.009855951478392721,
"acc_stderr": 0.0027210765770416625
},
"harness|winogrande|5": {
"acc": 0.7229676400947119,
"acc_stderr": 0.012577891015342417
}
}
```
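A run-specific split can also be requested directly: split names are the run timestamps with "-" and ":" replaced by "_", as listed in the configs metadata (e.g. "2023_10_21T21_38_34.169396" for the run above). A minimal sketch assuming that naming:

```python
from datasets import load_dataset

# Per-sample DROP (3-shot) details for the 2023-10-21 run,
# addressed by its timestamped split rather than "latest".
drop_details = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-7b-gpt4-1.3",
	"harness_drop_3",
	split="2023_10_21T21_38_34.169396")
```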
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_jondurbin__airoboros-7b-gpt4-1.3
|
[
"region:us"
] |
2023-08-18T10:19:36+00:00
|
{"pretty_name": "Evaluation run of jondurbin/airoboros-7b-gpt4-1.3", "dataset_summary": "Dataset automatically created during the evaluation run of model [jondurbin/airoboros-7b-gpt4-1.3](https://huggingface.co/jondurbin/airoboros-7b-gpt4-1.3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jondurbin__airoboros-7b-gpt4-1.3\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-21T21:38:34.169396](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-7b-gpt4-1.3/blob/main/results_2023-10-21T21-38-34.169396.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.09416946308724833,\n \"em_stderr\": 0.002991012149237377,\n \"f1\": 0.14642302852348915,\n \"f1_stderr\": 0.00310583043009117,\n \"acc\": 0.3664117957865523,\n \"acc_stderr\": 0.00764948379619204\n },\n \"harness|drop|3\": {\n \"em\": 0.09416946308724833,\n \"em_stderr\": 0.002991012149237377,\n \"f1\": 0.14642302852348915,\n \"f1_stderr\": 0.00310583043009117\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.009855951478392721,\n \"acc_stderr\": 0.0027210765770416625\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7229676400947119,\n \"acc_stderr\": 0.012577891015342417\n }\n}\n```", "repo_url": "https://huggingface.co/jondurbin/airoboros-7b-gpt4-1.3", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_31T14_14_54.874826", "path": ["**/details_harness|arc:challenge|25_2023-07-31T14:14:54.874826.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-31T14:14:54.874826.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_21T21_38_34.169396", "path": ["**/details_harness|drop|3_2023-10-21T21-38-34.169396.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-21T21-38-34.169396.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_21T21_38_34.169396", "path": ["**/details_harness|gsm8k|5_2023-10-21T21-38-34.169396.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-21T21-38-34.169396.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_31T14_14_54.874826", "path": ["**/details_harness|hellaswag|10_2023-07-31T14:14:54.874826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-31T14:14:54.874826.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_5", "data_files": [{"split": "2023_07_31T14_14_54.874826", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T14:14:54.874826.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-31T14:14:54.874826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T14:14:54.874826.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-31T14:14:54.874826.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-31T14:14:54.874826.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_31T14_14_54.874826", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T14:14:54.874826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T14:14:54.874826.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_31T14_14_54.874826", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-31T14:14:54.874826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-31T14:14:54.874826.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_31T14_14_54.874826", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-31T14:14:54.874826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-31T14:14:54.874826.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_31T14_14_54.874826", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T14:14:54.874826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T14:14:54.874826.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_31T14_14_54.874826", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T14:14:54.874826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T14:14:54.874826.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_31T14_14_54.874826", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-31T14:14:54.874826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-31T14:14:54.874826.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_31T14_14_54.874826", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T14:14:54.874826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T14:14:54.874826.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_31T14_14_54.874826", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T14:14:54.874826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T14:14:54.874826.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_31T14_14_54.874826", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T14:14:54.874826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T14:14:54.874826.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_31T14_14_54.874826", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T14:14:54.874826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T14:14:54.874826.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_31T14_14_54.874826", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-31T14:14:54.874826.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-31T14:14:54.874826.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_31T14_14_54.874826", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-31T14:14:54.874826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-31T14:14:54.874826.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_31T14_14_54.874826", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T14:14:54.874826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T14:14:54.874826.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_31T14_14_54.874826", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-31T14:14:54.874826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-31T14:14:54.874826.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_31T14_14_54.874826", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T14:14:54.874826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T14:14:54.874826.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_31T14_14_54.874826", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T14:14:54.874826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T14:14:54.874826.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_31T14_14_54.874826", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T14:14:54.874826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T14:14:54.874826.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_31T14_14_54.874826", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-31T14:14:54.874826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-31T14:14:54.874826.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_31T14_14_54.874826", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T14:14:54.874826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T14:14:54.874826.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_31T14_14_54.874826", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T14:14:54.874826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T14:14:54.874826.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_31T14_14_54.874826", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T14:14:54.874826.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T14:14:54.874826.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_31T14_14_54.874826", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T14:14:54.874826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T14:14:54.874826.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_31T14_14_54.874826", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T14:14:54.874826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T14:14:54.874826.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_31T14_14_54.874826", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T14:14:54.874826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T14:14:54.874826.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_31T14_14_54.874826", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T14:14:54.874826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T14:14:54.874826.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_31T14_14_54.874826", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T14:14:54.874826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T14:14:54.874826.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_31T14_14_54.874826", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T14:14:54.874826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T14:14:54.874826.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_31T14_14_54.874826", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T14:14:54.874826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T14:14:54.874826.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_31T14_14_54.874826", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T14:14:54.874826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T14:14:54.874826.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_31T14_14_54.874826", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T14:14:54.874826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T14:14:54.874826.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_31T14_14_54.874826", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T14:14:54.874826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T14:14:54.874826.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_31T14_14_54.874826", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T14:14:54.874826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T14:14:54.874826.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_31T14_14_54.874826", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-31T14:14:54.874826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-31T14:14:54.874826.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_31T14_14_54.874826", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T14:14:54.874826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T14:14:54.874826.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_31T14_14_54.874826", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-31T14:14:54.874826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-31T14:14:54.874826.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_31T14_14_54.874826", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T14:14:54.874826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T14:14:54.874826.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_31T14_14_54.874826", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T14:14:54.874826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T14:14:54.874826.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_31T14_14_54.874826", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T14:14:54.874826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T14:14:54.874826.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_31T14_14_54.874826", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-31T14:14:54.874826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-31T14:14:54.874826.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_31T14_14_54.874826", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-31T14:14:54.874826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-31T14:14:54.874826.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_31T14_14_54.874826", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T14:14:54.874826.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T14:14:54.874826.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_31T14_14_54.874826", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T14:14:54.874826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T14:14:54.874826.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_31T14_14_54.874826", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T14:14:54.874826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T14:14:54.874826.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_31T14_14_54.874826", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T14:14:54.874826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T14:14:54.874826.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_31T14_14_54.874826", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-31T14:14:54.874826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-31T14:14:54.874826.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_31T14_14_54.874826", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-31T14:14:54.874826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-31T14:14:54.874826.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_31T14_14_54.874826", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-31T14:14:54.874826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-31T14:14:54.874826.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_31T14_14_54.874826", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T14:14:54.874826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T14:14:54.874826.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_31T14_14_54.874826", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-31T14:14:54.874826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-31T14:14:54.874826.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_31T14_14_54.874826", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T14:14:54.874826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T14:14:54.874826.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_31T14_14_54.874826", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T14:14:54.874826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T14:14:54.874826.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_31T14_14_54.874826", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-31T14:14:54.874826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-31T14:14:54.874826.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_31T14_14_54.874826", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-31T14:14:54.874826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-31T14:14:54.874826.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_31T14_14_54.874826", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-31T14:14:54.874826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-31T14:14:54.874826.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_31T14_14_54.874826", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T14:14:54.874826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T14:14:54.874826.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_31T14_14_54.874826", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-31T14:14:54.874826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-31T14:14:54.874826.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_31T14_14_54.874826", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-31T14:14:54.874826.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-31T14:14:54.874826.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_31T14_14_54.874826", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-31T14:14:54.874826.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-31T14:14:54.874826.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_21T21_38_34.169396", "path": ["**/details_harness|winogrande|5_2023-10-21T21-38-34.169396.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-21T21-38-34.169396.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_31T14_14_54.874826", "path": ["results_2023-07-31T14:14:54.874826.parquet"]}, {"split": "2023_10_21T21_38_34.169396", "path": ["results_2023-10-21T21-38-34.169396.parquet"]}, {"split": "latest", "path": ["results_2023-10-21T21-38-34.169396.parquet"]}]}]}
|
2023-10-21T20:38:46+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of jondurbin/airoboros-7b-gpt4-1.3
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model jondurbin/airoboros-7b-gpt4-1.3 on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
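For example, a minimal sketch (the repository id below is an assumption inferred from the leaderboard's `details_<org>__<model>` naming pattern, since this rendering masks URLs):

```python
from datasets import load_dataset

# Assumed repo id, following the leaderboard's details_<org>__<model> convention
data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-7b-gpt4-1.3",
	"harness_winogrande_5",
	split="train")
```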
## Latest results
These are the latest results from run 2023-10-21T21:38:34.169396 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of jondurbin/airoboros-7b-gpt4-1.3",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model jondurbin/airoboros-7b-gpt4-1.3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-21T21:38:34.169396(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of jondurbin/airoboros-7b-gpt4-1.3",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model jondurbin/airoboros-7b-gpt4-1.3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-21T21:38:34.169396(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
23,
31,
171,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of jondurbin/airoboros-7b-gpt4-1.3## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model jondurbin/airoboros-7b-gpt4-1.3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-21T21:38:34.169396(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
e14d19576d5704fdcea429ca7ef1ae54e67c777c
|
# Dataset Card for Evaluation run of jondurbin/airoboros-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/jondurbin/airoboros-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [jondurbin/airoboros-7b](https://huggingface.co/jondurbin/airoboros-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-7b",
"harness_winogrande_5",
split="train")
```
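Each configuration also exposes one split per run timestamp. As a minimal sketch, using a timestamped split name that appears in this dataset's metadata (split names for other configurations may differ):

```python
from datasets import load_dataset

# Load one specific run of the GSM8K eval via its timestamped split name
run = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-7b",
	"harness_gsm8k_5",
	split="2023_10_22T18_06_24.676047")

# "latest" always points at the most recent run of that eval
latest = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-7b",
	"harness_gsm8k_5",
	split="latest")
```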
## Latest results
These are the [latest results from run 2023-10-22T18:06:24.676047](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-7b/blob/main/results_2023-10-22T18-06-24.676047.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.17260906040268456,
"em_stderr": 0.0038701413394570546,
"f1": 0.2366065436241609,
"f1_stderr": 0.003924713658600719,
"acc": 0.3653891607870639,
"acc_stderr": 0.00836463128895662
},
"harness|drop|3": {
"em": 0.17260906040268456,
"em_stderr": 0.0038701413394570546,
"f1": 0.2366065436241609,
"f1_stderr": 0.003924713658600719
},
"harness|gsm8k|5": {
"acc": 0.02122820318423048,
"acc_stderr": 0.003970449129848635
},
"harness|winogrande|5": {
"acc": 0.7095501183898973,
"acc_stderr": 0.012758813448064605
}
}
```
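The aggregated numbers above can also be loaded directly. A minimal sketch, assuming this dataset follows the same "results" configuration layout as the other leaderboard details datasets in this dump:

```python
from datasets import load_dataset

# "results" stores the aggregated metrics; "latest" points at the newest run
results = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-7b",
	"results",
	split="latest")
print(results[0])
```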
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_jondurbin__airoboros-7b
|
[
"region:us"
] |
2023-08-18T10:19:44+00:00
|
{"pretty_name": "Evaluation run of jondurbin/airoboros-7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [jondurbin/airoboros-7b](https://huggingface.co/jondurbin/airoboros-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jondurbin__airoboros-7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-22T18:06:24.676047](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-7b/blob/main/results_2023-10-22T18-06-24.676047.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.17260906040268456,\n \"em_stderr\": 0.0038701413394570546,\n \"f1\": 0.2366065436241609,\n \"f1_stderr\": 0.003924713658600719,\n \"acc\": 0.3653891607870639,\n \"acc_stderr\": 0.00836463128895662\n },\n \"harness|drop|3\": {\n \"em\": 0.17260906040268456,\n \"em_stderr\": 0.0038701413394570546,\n \"f1\": 0.2366065436241609,\n \"f1_stderr\": 0.003924713658600719\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.02122820318423048,\n \"acc_stderr\": 0.003970449129848635\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7095501183898973,\n \"acc_stderr\": 0.012758813448064605\n }\n}\n```", "repo_url": "https://huggingface.co/jondurbin/airoboros-7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T16_47_41.797707", "path": ["**/details_harness|arc:challenge|25_2023-07-19T16:47:41.797707.parquet"]}, {"split": "2023_08_03T10_53_16.079239", "path": ["**/details_harness|arc:challenge|25_2023-08-03T10:53:16.079239.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-03T10:53:16.079239.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_22T14_45_03.527749", "path": ["**/details_harness|drop|3_2023-10-22T14-45-03.527749.parquet"]}, {"split": "2023_10_22T18_06_24.676047", "path": ["**/details_harness|drop|3_2023-10-22T18-06-24.676047.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-22T18-06-24.676047.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_22T14_45_03.527749", "path": ["**/details_harness|gsm8k|5_2023-10-22T14-45-03.527749.parquet"]}, {"split": "2023_10_22T18_06_24.676047", "path": ["**/details_harness|gsm8k|5_2023-10-22T18-06-24.676047.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-22T18-06-24.676047.parquet"]}]}, 
{"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T16_47_41.797707", "path": ["**/details_harness|hellaswag|10_2023-07-19T16:47:41.797707.parquet"]}, {"split": "2023_08_03T10_53_16.079239", "path": ["**/details_harness|hellaswag|10_2023-08-03T10:53:16.079239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-03T10:53:16.079239.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T16_47_41.797707", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:47:41.797707.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:47:41.797707.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:47:41.797707.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:47:41.797707.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:47:41.797707.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:47:41.797707.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:47:41.797707.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:47:41.797707.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:47:41.797707.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:47:41.797707.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:47:41.797707.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:47:41.797707.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:47:41.797707.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:47:41.797707.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:47:41.797707.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:47:41.797707.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:47:41.797707.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:47:41.797707.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:47:41.797707.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:47:41.797707.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:47:41.797707.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:47:41.797707.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:47:41.797707.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:47:41.797707.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:47:41.797707.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:47:41.797707.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:47:41.797707.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:47:41.797707.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:47:41.797707.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:47:41.797707.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:47:41.797707.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:47:41.797707.parquet", 
"**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:47:41.797707.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:47:41.797707.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:47:41.797707.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:47:41.797707.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:47:41.797707.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:47:41.797707.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T16:47:41.797707.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:47:41.797707.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:47:41.797707.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:47:41.797707.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:47:41.797707.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:47:41.797707.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:47:41.797707.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:47:41.797707.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:47:41.797707.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:47:41.797707.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:47:41.797707.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:47:41.797707.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:47:41.797707.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:47:41.797707.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:47:41.797707.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:47:41.797707.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:47:41.797707.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T16:47:41.797707.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:47:41.797707.parquet"]}, {"split": "2023_08_03T10_53_16.079239", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-03T10:53:16.079239.parquet", 
"**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-03T10:53:16.079239.parquet", 
"**/details_harness|hendrycksTest-security_studies|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-03T10:53:16.079239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-03T10:53:16.079239.parquet", 
"**/details_harness|hendrycksTest-human_sexuality|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-03T10:53:16.079239.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-03T10:53:16.079239.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_19T16_47_41.797707", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:47:41.797707.parquet"]}, {"split": "2023_08_03T10_53_16.079239", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-03T10:53:16.079239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-03T10:53:16.079239.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T16_47_41.797707", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:47:41.797707.parquet"]}, {"split": "2023_08_03T10_53_16.079239", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-03T10:53:16.079239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-03T10:53:16.079239.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T16_47_41.797707", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:47:41.797707.parquet"]}, {"split": "2023_08_03T10_53_16.079239", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-03T10:53:16.079239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-03T10:53:16.079239.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T16_47_41.797707", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:47:41.797707.parquet"]}, {"split": "2023_08_03T10_53_16.079239", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-03T10:53:16.079239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-03T10:53:16.079239.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T16_47_41.797707", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:47:41.797707.parquet"]}, {"split": "2023_08_03T10_53_16.079239", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-03T10:53:16.079239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-03T10:53:16.079239.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T16_47_41.797707", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:47:41.797707.parquet"]}, {"split": "2023_08_03T10_53_16.079239", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-03T10:53:16.079239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-03T10:53:16.079239.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T16_47_41.797707", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:47:41.797707.parquet"]}, {"split": "2023_08_03T10_53_16.079239", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-03T10:53:16.079239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-03T10:53:16.079239.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T16_47_41.797707", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:47:41.797707.parquet"]}, {"split": "2023_08_03T10_53_16.079239", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-03T10:53:16.079239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-03T10:53:16.079239.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T16_47_41.797707", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:47:41.797707.parquet"]}, {"split": "2023_08_03T10_53_16.079239", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-03T10:53:16.079239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-03T10:53:16.079239.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T16_47_41.797707", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:47:41.797707.parquet"]}, {"split": "2023_08_03T10_53_16.079239", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-03T10:53:16.079239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-03T10:53:16.079239.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T16_47_41.797707", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:47:41.797707.parquet"]}, {"split": "2023_08_03T10_53_16.079239", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-03T10:53:16.079239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-03T10:53:16.079239.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T16_47_41.797707", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:47:41.797707.parquet"]}, {"split": "2023_08_03T10_53_16.079239", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-03T10:53:16.079239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-03T10:53:16.079239.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T16_47_41.797707", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:47:41.797707.parquet"]}, {"split": "2023_08_03T10_53_16.079239", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-03T10:53:16.079239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-03T10:53:16.079239.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T16_47_41.797707", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:47:41.797707.parquet"]}, {"split": "2023_08_03T10_53_16.079239", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-03T10:53:16.079239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-03T10:53:16.079239.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T16_47_41.797707", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:47:41.797707.parquet"]}, {"split": "2023_08_03T10_53_16.079239", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-03T10:53:16.079239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-03T10:53:16.079239.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T16_47_41.797707", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:47:41.797707.parquet"]}, {"split": "2023_08_03T10_53_16.079239", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-03T10:53:16.079239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-03T10:53:16.079239.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T16_47_41.797707", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:47:41.797707.parquet"]}, {"split": "2023_08_03T10_53_16.079239", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-03T10:53:16.079239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-03T10:53:16.079239.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T16_47_41.797707", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:47:41.797707.parquet"]}, {"split": "2023_08_03T10_53_16.079239", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-08-03T10:53:16.079239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-03T10:53:16.079239.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T16_47_41.797707", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:47:41.797707.parquet"]}, {"split": "2023_08_03T10_53_16.079239", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-03T10:53:16.079239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-03T10:53:16.079239.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T16_47_41.797707", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:47:41.797707.parquet"]}, {"split": "2023_08_03T10_53_16.079239", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-03T10:53:16.079239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-03T10:53:16.079239.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T16_47_41.797707", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:47:41.797707.parquet"]}, {"split": "2023_08_03T10_53_16.079239", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-03T10:53:16.079239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-03T10:53:16.079239.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T16_47_41.797707", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:47:41.797707.parquet"]}, {"split": "2023_08_03T10_53_16.079239", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-03T10:53:16.079239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-03T10:53:16.079239.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T16_47_41.797707", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:47:41.797707.parquet"]}, {"split": "2023_08_03T10_53_16.079239", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-03T10:53:16.079239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-03T10:53:16.079239.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T16_47_41.797707", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:47:41.797707.parquet"]}, {"split": "2023_08_03T10_53_16.079239", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-03T10:53:16.079239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-03T10:53:16.079239.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T16_47_41.797707", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:47:41.797707.parquet"]}, 
{"split": "2023_08_03T10_53_16.079239", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-03T10:53:16.079239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-03T10:53:16.079239.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T16_47_41.797707", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:47:41.797707.parquet"]}, {"split": "2023_08_03T10_53_16.079239", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-03T10:53:16.079239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-03T10:53:16.079239.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T16_47_41.797707", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:47:41.797707.parquet"]}, {"split": "2023_08_03T10_53_16.079239", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-03T10:53:16.079239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-03T10:53:16.079239.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T16_47_41.797707", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:47:41.797707.parquet"]}, {"split": "2023_08_03T10_53_16.079239", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-03T10:53:16.079239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-03T10:53:16.079239.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T16_47_41.797707", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:47:41.797707.parquet"]}, {"split": "2023_08_03T10_53_16.079239", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-03T10:53:16.079239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-03T10:53:16.079239.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T16_47_41.797707", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:47:41.797707.parquet"]}, {"split": "2023_08_03T10_53_16.079239", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-03T10:53:16.079239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-03T10:53:16.079239.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T16_47_41.797707", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:47:41.797707.parquet"]}, {"split": "2023_08_03T10_53_16.079239", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-03T10:53:16.079239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-03T10:53:16.079239.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T16_47_41.797707", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:47:41.797707.parquet"]}, 
{"split": "2023_08_03T10_53_16.079239", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-03T10:53:16.079239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-03T10:53:16.079239.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T16_47_41.797707", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:47:41.797707.parquet"]}, {"split": "2023_08_03T10_53_16.079239", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-03T10:53:16.079239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-03T10:53:16.079239.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T16_47_41.797707", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:47:41.797707.parquet"]}, {"split": "2023_08_03T10_53_16.079239", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-03T10:53:16.079239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-03T10:53:16.079239.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T16_47_41.797707", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:47:41.797707.parquet"]}, {"split": "2023_08_03T10_53_16.079239", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-03T10:53:16.079239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-03T10:53:16.079239.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T16_47_41.797707", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:47:41.797707.parquet"]}, {"split": "2023_08_03T10_53_16.079239", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-03T10:53:16.079239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-03T10:53:16.079239.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T16_47_41.797707", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:47:41.797707.parquet"]}, {"split": "2023_08_03T10_53_16.079239", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-03T10:53:16.079239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-03T10:53:16.079239.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T16_47_41.797707", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:47:41.797707.parquet"]}, {"split": "2023_08_03T10_53_16.079239", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-03T10:53:16.079239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-03T10:53:16.079239.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T16_47_41.797707", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T16:47:41.797707.parquet"]}, {"split": "2023_08_03T10_53_16.079239", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-03T10:53:16.079239.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-management|5_2023-08-03T10:53:16.079239.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T16_47_41.797707", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:47:41.797707.parquet"]}, {"split": "2023_08_03T10_53_16.079239", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-03T10:53:16.079239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-03T10:53:16.079239.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T16_47_41.797707", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:47:41.797707.parquet"]}, {"split": "2023_08_03T10_53_16.079239", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-03T10:53:16.079239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-03T10:53:16.079239.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T16_47_41.797707", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:47:41.797707.parquet"]}, {"split": "2023_08_03T10_53_16.079239", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-03T10:53:16.079239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-03T10:53:16.079239.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T16_47_41.797707", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:47:41.797707.parquet"]}, {"split": "2023_08_03T10_53_16.079239", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-03T10:53:16.079239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-03T10:53:16.079239.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T16_47_41.797707", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:47:41.797707.parquet"]}, {"split": "2023_08_03T10_53_16.079239", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-03T10:53:16.079239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-03T10:53:16.079239.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T16_47_41.797707", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:47:41.797707.parquet"]}, {"split": "2023_08_03T10_53_16.079239", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-03T10:53:16.079239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-03T10:53:16.079239.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T16_47_41.797707", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:47:41.797707.parquet"]}, {"split": "2023_08_03T10_53_16.079239", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-03T10:53:16.079239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-03T10:53:16.079239.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T16_47_41.797707", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:47:41.797707.parquet"]}, 
{"split": "2023_08_03T10_53_16.079239", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-03T10:53:16.079239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-03T10:53:16.079239.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T16_47_41.797707", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:47:41.797707.parquet"]}, {"split": "2023_08_03T10_53_16.079239", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-03T10:53:16.079239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-03T10:53:16.079239.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T16_47_41.797707", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:47:41.797707.parquet"]}, {"split": "2023_08_03T10_53_16.079239", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-03T10:53:16.079239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-03T10:53:16.079239.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T16_47_41.797707", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:47:41.797707.parquet"]}, {"split": "2023_08_03T10_53_16.079239", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-03T10:53:16.079239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-03T10:53:16.079239.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T16_47_41.797707", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:47:41.797707.parquet"]}, {"split": "2023_08_03T10_53_16.079239", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-03T10:53:16.079239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-03T10:53:16.079239.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_19T16_47_41.797707", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:47:41.797707.parquet"]}, {"split": "2023_08_03T10_53_16.079239", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-03T10:53:16.079239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-03T10:53:16.079239.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T16_47_41.797707", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:47:41.797707.parquet"]}, {"split": "2023_08_03T10_53_16.079239", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-03T10:53:16.079239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-03T10:53:16.079239.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T16_47_41.797707", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:47:41.797707.parquet"]}, {"split": "2023_08_03T10_53_16.079239", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-03T10:53:16.079239.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-03T10:53:16.079239.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T16_47_41.797707", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:47:41.797707.parquet"]}, {"split": "2023_08_03T10_53_16.079239", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-03T10:53:16.079239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-03T10:53:16.079239.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T16_47_41.797707", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T16:47:41.797707.parquet"]}, {"split": "2023_08_03T10_53_16.079239", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-03T10:53:16.079239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-03T10:53:16.079239.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T16_47_41.797707", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:47:41.797707.parquet"]}, {"split": "2023_08_03T10_53_16.079239", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-03T10:53:16.079239.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-03T10:53:16.079239.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T16_47_41.797707", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T16:47:41.797707.parquet"]}, {"split": "2023_08_03T10_53_16.079239", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-03T10:53:16.079239.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-03T10:53:16.079239.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_22T14_45_03.527749", "path": ["**/details_harness|winogrande|5_2023-10-22T14-45-03.527749.parquet"]}, {"split": "2023_10_22T18_06_24.676047", "path": ["**/details_harness|winogrande|5_2023-10-22T18-06-24.676047.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-22T18-06-24.676047.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T16_47_41.797707", "path": ["results_2023-07-19T16:47:41.797707.parquet"]}, {"split": "2023_08_03T10_53_16.079239", "path": ["results_2023-08-03T10:53:16.079239.parquet"]}, {"split": "2023_10_22T14_45_03.527749", "path": ["results_2023-10-22T14-45-03.527749.parquet"]}, {"split": "2023_10_22T18_06_24.676047", "path": ["results_2023-10-22T18-06-24.676047.parquet"]}, {"split": "latest", "path": ["results_2023-10-22T18-06-24.676047.parquet"]}]}]}
|
2023-10-22T17:06:33+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of jondurbin/airoboros-7b
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model jondurbin/airoboros-7b on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
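A minimal sketch, assuming this card follows the same `details_<org>__<model>` naming convention as the other evaluation-run datasets in this document (the repository id below is inferred, not stated in this stripped card):

```python
from datasets import load_dataset

# Repository id assumed from the open-llm-leaderboard naming convention;
# adjust if the actual dataset id differs.
data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-7b",
	"harness_winogrande_5",
	split="train")
```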
## Latest results
These are the latest results from run 2023-10-22T18:06:24.676047 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
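The aggregated numbers live in the "results" configuration, whose "latest" split always resolves to the most recent run; a sketch under the same repository-id assumption as above:

```python
from datasets import load_dataset

# "results" stores the aggregated metrics; "latest" points at the newest
# results parquet file (repository id assumed, as above).
results = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-7b",
	"results",
	split="latest")
```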
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of jondurbin/airoboros-7b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model jondurbin/airoboros-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-22T18:06:24.676047(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of jondurbin/airoboros-7b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model jondurbin/airoboros-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-22T18:06:24.676047(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
18,
31,
166,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of jondurbin/airoboros-7b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model jondurbin/airoboros-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-22T18:06:24.676047(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
936e43edb5dfae1b10751bb1c5a301feaa1543f6
|
# Dataset Card for Evaluation run of jondurbin/airoboros-l2-13b-gpt4-1.4.1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/jondurbin/airoboros-l2-13b-gpt4-1.4.1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [jondurbin/airoboros-l2-13b-gpt4-1.4.1](https://huggingface.co/jondurbin/airoboros-l2-13b-gpt4-1.4.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-l2-13b-gpt4-1.4.1",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-22T19:03:13.374959](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-l2-13b-gpt4-1.4.1/blob/main/results_2023-10-22T19-03-13.374959.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0026216442953020135,
"em_stderr": 0.0005236685642965807,
"f1": 0.07133494127516771,
"f1_stderr": 0.0015039896976380969,
"acc": 0.40148895416572666,
"acc_stderr": 0.00972321783657909
},
"harness|drop|3": {
"em": 0.0026216442953020135,
"em_stderr": 0.0005236685642965807,
"f1": 0.07133494127516771,
"f1_stderr": 0.0015039896976380969
},
"harness|gsm8k|5": {
"acc": 0.06974981046247157,
"acc_stderr": 0.00701638957101385
},
"harness|winogrande|5": {
"acc": 0.7332280978689818,
"acc_stderr": 0.012430046102144331
}
}
```
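These aggregated numbers can also be loaded programmatically from the "results" configuration declared in this card's metadata, whose "latest" split resolves to the most recent run:

```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics shown above;
# its "latest" split currently points at results_2023-10-22T19-03-13.374959.parquet.
results = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-l2-13b-gpt4-1.4.1",
	"results",
	split="latest")
```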
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_jondurbin__airoboros-l2-13b-gpt4-1.4.1
|
[
"region:us"
] |
2023-08-18T10:20:01+00:00
|
{"pretty_name": "Evaluation run of jondurbin/airoboros-l2-13b-gpt4-1.4.1", "dataset_summary": "Dataset automatically created during the evaluation run of model [jondurbin/airoboros-l2-13b-gpt4-1.4.1](https://huggingface.co/jondurbin/airoboros-l2-13b-gpt4-1.4.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jondurbin__airoboros-l2-13b-gpt4-1.4.1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-22T19:03:13.374959](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-l2-13b-gpt4-1.4.1/blob/main/results_2023-10-22T19-03-13.374959.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0026216442953020135,\n \"em_stderr\": 0.0005236685642965807,\n \"f1\": 0.07133494127516771,\n \"f1_stderr\": 0.0015039896976380969,\n \"acc\": 0.40148895416572666,\n \"acc_stderr\": 0.00972321783657909\n },\n \"harness|drop|3\": {\n \"em\": 0.0026216442953020135,\n \"em_stderr\": 0.0005236685642965807,\n \"f1\": 0.07133494127516771,\n \"f1_stderr\": 0.0015039896976380969\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.06974981046247157,\n \"acc_stderr\": 0.00701638957101385\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7332280978689818,\n \"acc_stderr\": 0.012430046102144331\n }\n}\n```", "repo_url": "https://huggingface.co/jondurbin/airoboros-l2-13b-gpt4-1.4.1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_24T15_17_43.655120", "path": ["**/details_harness|arc:challenge|25_2023-07-24T15:17:43.655120.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-24T15:17:43.655120.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_22T19_03_13.374959", "path": ["**/details_harness|drop|3_2023-10-22T19-03-13.374959.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-22T19-03-13.374959.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_22T19_03_13.374959", "path": ["**/details_harness|gsm8k|5_2023-10-22T19-03-13.374959.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-22T19-03-13.374959.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_24T15_17_43.655120", "path": ["**/details_harness|hellaswag|10_2023-07-24T15:17:43.655120.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hellaswag|10_2023-07-24T15:17:43.655120.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_24T15_17_43.655120", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T15:17:43.655120.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-24T15:17:43.655120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T15:17:43.655120.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T15:17:43.655120.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T15:17:43.655120.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-07-24T15:17:43.655120.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_24T15_17_43.655120", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T15:17:43.655120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T15:17:43.655120.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_24T15_17_43.655120", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T15:17:43.655120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T15:17:43.655120.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_24T15_17_43.655120", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T15:17:43.655120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T15:17:43.655120.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_24T15_17_43.655120", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T15:17:43.655120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T15:17:43.655120.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_24T15_17_43.655120", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T15:17:43.655120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T15:17:43.655120.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_24T15_17_43.655120", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T15:17:43.655120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T15:17:43.655120.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_24T15_17_43.655120", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T15:17:43.655120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T15:17:43.655120.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_24T15_17_43.655120", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T15:17:43.655120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T15:17:43.655120.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_24T15_17_43.655120", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T15:17:43.655120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T15:17:43.655120.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_24T15_17_43.655120", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T15:17:43.655120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T15:17:43.655120.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_07_24T15_17_43.655120", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T15:17:43.655120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T15:17:43.655120.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_24T15_17_43.655120", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T15:17:43.655120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T15:17:43.655120.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_24T15_17_43.655120", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T15:17:43.655120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T15:17:43.655120.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_24T15_17_43.655120", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T15:17:43.655120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T15:17:43.655120.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_24T15_17_43.655120", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T15:17:43.655120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T15:17:43.655120.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_24T15_17_43.655120", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T15:17:43.655120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T15:17:43.655120.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_24T15_17_43.655120", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T15:17:43.655120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T15:17:43.655120.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_24T15_17_43.655120", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T15:17:43.655120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T15:17:43.655120.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_24T15_17_43.655120", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T15:17:43.655120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T15:17:43.655120.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_24T15_17_43.655120", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T15:17:43.655120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T15:17:43.655120.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_24T15_17_43.655120", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T15:17:43.655120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T15:17:43.655120.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_24T15_17_43.655120", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T15:17:43.655120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T15:17:43.655120.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_24T15_17_43.655120", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T15:17:43.655120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T15:17:43.655120.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_24T15_17_43.655120", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T15:17:43.655120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T15:17:43.655120.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_24T15_17_43.655120", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T15:17:43.655120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T15:17:43.655120.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_24T15_17_43.655120", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T15:17:43.655120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T15:17:43.655120.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_24T15_17_43.655120", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T15:17:43.655120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T15:17:43.655120.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_24T15_17_43.655120", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T15:17:43.655120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T15:17:43.655120.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_24T15_17_43.655120", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T15:17:43.655120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T15:17:43.655120.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_24T15_17_43.655120", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T15:17:43.655120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T15:17:43.655120.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_24T15_17_43.655120", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T15:17:43.655120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T15:17:43.655120.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_24T15_17_43.655120", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T15:17:43.655120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T15:17:43.655120.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_24T15_17_43.655120", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T15:17:43.655120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T15:17:43.655120.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_24T15_17_43.655120", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T15:17:43.655120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T15:17:43.655120.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_24T15_17_43.655120", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T15:17:43.655120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T15:17:43.655120.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_24T15_17_43.655120", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T15:17:43.655120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T15:17:43.655120.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_24T15_17_43.655120", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T15:17:43.655120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T15:17:43.655120.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_24T15_17_43.655120", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T15:17:43.655120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T15:17:43.655120.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_24T15_17_43.655120", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T15:17:43.655120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T15:17:43.655120.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_24T15_17_43.655120", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T15:17:43.655120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T15:17:43.655120.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_24T15_17_43.655120", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T15:17:43.655120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T15:17:43.655120.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_24T15_17_43.655120", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T15:17:43.655120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T15:17:43.655120.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_24T15_17_43.655120", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T15:17:43.655120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T15:17:43.655120.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_24T15_17_43.655120", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T15:17:43.655120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T15:17:43.655120.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_24T15_17_43.655120", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T15:17:43.655120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T15:17:43.655120.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_24T15_17_43.655120", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T15:17:43.655120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T15:17:43.655120.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_24T15_17_43.655120", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T15:17:43.655120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T15:17:43.655120.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_24T15_17_43.655120", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T15:17:43.655120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T15:17:43.655120.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_24T15_17_43.655120", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T15:17:43.655120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T15:17:43.655120.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_24T15_17_43.655120", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T15:17:43.655120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T15:17:43.655120.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_24T15_17_43.655120", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T15:17:43.655120.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T15:17:43.655120.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_24T15_17_43.655120", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T15:17:43.655120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T15:17:43.655120.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_24T15_17_43.655120", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T15:17:43.655120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T15:17:43.655120.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_24T15_17_43.655120", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T15:17:43.655120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T15:17:43.655120.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_24T15_17_43.655120", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T15:17:43.655120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T15:17:43.655120.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_24T15_17_43.655120", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T15:17:43.655120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T15:17:43.655120.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_24T15_17_43.655120", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T15:17:43.655120.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T15:17:43.655120.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_24T15_17_43.655120", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T15:17:43.655120.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T15:17:43.655120.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_22T19_03_13.374959", "path": ["**/details_harness|winogrande|5_2023-10-22T19-03-13.374959.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-22T19-03-13.374959.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_24T15_17_43.655120", "path": ["results_2023-07-24T15:17:43.655120.parquet"]}, {"split": "2023_10_22T19_03_13.374959", "path": ["results_2023-10-22T19-03-13.374959.parquet"]}, {"split": "latest", "path": ["results_2023-10-22T19-03-13.374959.parquet"]}]}]}
|
2023-10-22T18:03:25+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of jondurbin/airoboros-l2-13b-gpt4-1.4.1
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model jondurbin/airoboros-l2-13b-gpt4-1.4.1 on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
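A minimal sketch (since this card's links are redacted, the repository id below is assumed from the standard `open-llm-leaderboard/details_<org>__<model>` naming pattern):

```python
from datasets import load_dataset

# Load the per-example details of the latest run for one task configuration.
data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-l2-13b-gpt4-1.4.1",
                    "harness_winogrande_5",
                    split="train")
```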
## Latest results
These are the latest results from run 2023-10-22T19:03:13.374959 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of jondurbin/airoboros-l2-13b-gpt4-1.4.1",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model jondurbin/airoboros-l2-13b-gpt4-1.4.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-22T19:03:13.374959(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of jondurbin/airoboros-l2-13b-gpt4-1.4.1",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model jondurbin/airoboros-l2-13b-gpt4-1.4.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-22T19:03:13.374959(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
27,
31,
175,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of jondurbin/airoboros-l2-13b-gpt4-1.4.1## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model jondurbin/airoboros-l2-13b-gpt4-1.4.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-22T19:03:13.374959(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
a6e5311366d86464672844cb1b1a60d55a172b06
|
# Dataset Card for Evaluation run of jondurbin/airoboros-33b-gpt4-1.4
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/jondurbin/airoboros-33b-gpt4-1.4
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [jondurbin/airoboros-33b-gpt4-1.4](https://huggingface.co/jondurbin/airoboros-33b-gpt4-1.4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-33b-gpt4-1.4",
"harness_winogrande_5",
split="train")
```
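As a complementary sketch (assuming the repository layout described above), the aggregated "results" configuration can be loaded the same way; its "latest" split resolves to the most recent run:

```python
from datasets import load_dataset

# Aggregated metrics across all runs; "latest" points to the newest timestamped split.
results = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-33b-gpt4-1.4",
                       "results",
                       split="latest")
```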
## Latest results
These are the [latest results from run 2023-10-22T23:35:00.134396](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-33b-gpt4-1.4/blob/main/results_2023-10-22T23-35-00.134396.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.008074664429530202,
"em_stderr": 0.0009165188135511226,
"f1": 0.07536493288590619,
"f1_stderr": 0.0016308549138484363,
"acc": 0.4482593962444343,
"acc_stderr": 0.010265777447066696
},
"harness|drop|3": {
"em": 0.008074664429530202,
"em_stderr": 0.0009165188135511226,
"f1": 0.07536493288590619,
"f1_stderr": 0.0016308549138484363
},
"harness|gsm8k|5": {
"acc": 0.11751326762699014,
"acc_stderr": 0.008870331256489982
},
"harness|winogrande|5": {
"acc": 0.7790055248618785,
"acc_stderr": 0.011661223637643412
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_jondurbin__airoboros-33b-gpt4-1.4
|
[
"region:us"
] |
2023-08-18T10:20:09+00:00
|
{"pretty_name": "Evaluation run of jondurbin/airoboros-33b-gpt4-1.4", "dataset_summary": "Dataset automatically created during the evaluation run of model [jondurbin/airoboros-33b-gpt4-1.4](https://huggingface.co/jondurbin/airoboros-33b-gpt4-1.4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jondurbin__airoboros-33b-gpt4-1.4\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-22T23:35:00.134396](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-33b-gpt4-1.4/blob/main/results_2023-10-22T23-35-00.134396.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.008074664429530202,\n \"em_stderr\": 0.0009165188135511226,\n \"f1\": 0.07536493288590619,\n \"f1_stderr\": 0.0016308549138484363,\n \"acc\": 0.4482593962444343,\n \"acc_stderr\": 0.010265777447066696\n },\n \"harness|drop|3\": {\n \"em\": 0.008074664429530202,\n \"em_stderr\": 0.0009165188135511226,\n \"f1\": 0.07536493288590619,\n \"f1_stderr\": 0.0016308549138484363\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.11751326762699014,\n \"acc_stderr\": 0.008870331256489982\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7790055248618785,\n \"acc_stderr\": 0.011661223637643412\n }\n}\n```", "repo_url": "https://huggingface.co/jondurbin/airoboros-33b-gpt4-1.4", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_31T12_50_28.372166", "path": ["**/details_harness|arc:challenge|25_2023-07-31T12:50:28.372166.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-31T12:50:28.372166.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_22T20_41_55.992984", "path": ["**/details_harness|drop|3_2023-10-22T20-41-55.992984.parquet"]}, {"split": "2023_10_22T23_35_00.134396", "path": ["**/details_harness|drop|3_2023-10-22T23-35-00.134396.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-22T23-35-00.134396.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_22T20_41_55.992984", "path": ["**/details_harness|gsm8k|5_2023-10-22T20-41-55.992984.parquet"]}, {"split": "2023_10_22T23_35_00.134396", "path": ["**/details_harness|gsm8k|5_2023-10-22T23-35-00.134396.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-22T23-35-00.134396.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": 
"2023_07_31T12_50_28.372166", "path": ["**/details_harness|hellaswag|10_2023-07-31T12:50:28.372166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-31T12:50:28.372166.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_31T12_50_28.372166", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T12:50:28.372166.parquet", 
"**/details_harness|hendrycksTest-international_law|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-31T12:50:28.372166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T12:50:28.372166.parquet", 
"**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-31T12:50:28.372166.parquet", 
"**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-31T12:50:28.372166.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-31T12:50:28.372166.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_31T12_50_28.372166", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T12:50:28.372166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T12:50:28.372166.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_31T12_50_28.372166", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-31T12:50:28.372166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-31T12:50:28.372166.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_31T12_50_28.372166", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-31T12:50:28.372166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-31T12:50:28.372166.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_31T12_50_28.372166", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T12:50:28.372166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T12:50:28.372166.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_31T12_50_28.372166", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T12:50:28.372166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T12:50:28.372166.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_31T12_50_28.372166", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-31T12:50:28.372166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-31T12:50:28.372166.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_31T12_50_28.372166", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T12:50:28.372166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T12:50:28.372166.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_31T12_50_28.372166", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T12:50:28.372166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T12:50:28.372166.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_31T12_50_28.372166", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T12:50:28.372166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T12:50:28.372166.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_31T12_50_28.372166", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T12:50:28.372166.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T12:50:28.372166.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_31T12_50_28.372166", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-31T12:50:28.372166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-31T12:50:28.372166.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_31T12_50_28.372166", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-31T12:50:28.372166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-31T12:50:28.372166.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_31T12_50_28.372166", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T12:50:28.372166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T12:50:28.372166.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_31T12_50_28.372166", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-31T12:50:28.372166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-31T12:50:28.372166.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_31T12_50_28.372166", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T12:50:28.372166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T12:50:28.372166.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_31T12_50_28.372166", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T12:50:28.372166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T12:50:28.372166.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_31T12_50_28.372166", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T12:50:28.372166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T12:50:28.372166.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_31T12_50_28.372166", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-31T12:50:28.372166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-31T12:50:28.372166.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_31T12_50_28.372166", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T12:50:28.372166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T12:50:28.372166.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_31T12_50_28.372166", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T12:50:28.372166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T12:50:28.372166.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_31T12_50_28.372166", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T12:50:28.372166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T12:50:28.372166.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_31T12_50_28.372166", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T12:50:28.372166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T12:50:28.372166.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_31T12_50_28.372166", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T12:50:28.372166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T12:50:28.372166.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_31T12_50_28.372166", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T12:50:28.372166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T12:50:28.372166.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_31T12_50_28.372166", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T12:50:28.372166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T12:50:28.372166.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_31T12_50_28.372166", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T12:50:28.372166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T12:50:28.372166.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_31T12_50_28.372166", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T12:50:28.372166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T12:50:28.372166.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_31T12_50_28.372166", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T12:50:28.372166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T12:50:28.372166.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_31T12_50_28.372166", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T12:50:28.372166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T12:50:28.372166.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_31T12_50_28.372166", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T12:50:28.372166.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T12:50:28.372166.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_31T12_50_28.372166", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T12:50:28.372166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T12:50:28.372166.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_31T12_50_28.372166", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T12:50:28.372166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T12:50:28.372166.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_31T12_50_28.372166", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-31T12:50:28.372166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-31T12:50:28.372166.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_31T12_50_28.372166", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T12:50:28.372166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T12:50:28.372166.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_31T12_50_28.372166", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-31T12:50:28.372166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-31T12:50:28.372166.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_31T12_50_28.372166", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T12:50:28.372166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T12:50:28.372166.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_31T12_50_28.372166", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T12:50:28.372166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T12:50:28.372166.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_31T12_50_28.372166", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T12:50:28.372166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T12:50:28.372166.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_31T12_50_28.372166", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-31T12:50:28.372166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-31T12:50:28.372166.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_31T12_50_28.372166", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-31T12:50:28.372166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-31T12:50:28.372166.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": 
[{"split": "2023_07_31T12_50_28.372166", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T12:50:28.372166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T12:50:28.372166.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_31T12_50_28.372166", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T12:50:28.372166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T12:50:28.372166.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_31T12_50_28.372166", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T12:50:28.372166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T12:50:28.372166.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_31T12_50_28.372166", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T12:50:28.372166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T12:50:28.372166.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_31T12_50_28.372166", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-31T12:50:28.372166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-31T12:50:28.372166.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_31T12_50_28.372166", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-31T12:50:28.372166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-31T12:50:28.372166.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_31T12_50_28.372166", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-31T12:50:28.372166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-31T12:50:28.372166.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_31T12_50_28.372166", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T12:50:28.372166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T12:50:28.372166.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_31T12_50_28.372166", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-31T12:50:28.372166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-31T12:50:28.372166.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_31T12_50_28.372166", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T12:50:28.372166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T12:50:28.372166.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_31T12_50_28.372166", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T12:50:28.372166.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T12:50:28.372166.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_31T12_50_28.372166", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-31T12:50:28.372166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-31T12:50:28.372166.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_31T12_50_28.372166", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-31T12:50:28.372166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-31T12:50:28.372166.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_31T12_50_28.372166", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-31T12:50:28.372166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-31T12:50:28.372166.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_31T12_50_28.372166", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T12:50:28.372166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T12:50:28.372166.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_31T12_50_28.372166", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-31T12:50:28.372166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-31T12:50:28.372166.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_31T12_50_28.372166", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-31T12:50:28.372166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-31T12:50:28.372166.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_31T12_50_28.372166", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-31T12:50:28.372166.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-31T12:50:28.372166.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_22T20_41_55.992984", "path": ["**/details_harness|winogrande|5_2023-10-22T20-41-55.992984.parquet"]}, {"split": "2023_10_22T23_35_00.134396", "path": ["**/details_harness|winogrande|5_2023-10-22T23-35-00.134396.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-22T23-35-00.134396.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_31T12_50_28.372166", "path": ["results_2023-07-31T12:50:28.372166.parquet"]}, {"split": "2023_10_22T20_41_55.992984", "path": ["results_2023-10-22T20-41-55.992984.parquet"]}, {"split": "2023_10_22T23_35_00.134396", "path": ["results_2023-10-22T23-35-00.134396.parquet"]}, {"split": "latest", "path": ["results_2023-10-22T23-35-00.134396.parquet"]}]}]}
|
2023-10-22T22:35:09+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of jondurbin/airoboros-33b-gpt4-1.4
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model jondurbin/airoboros-33b-gpt4-1.4 on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
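A minimal sketch (since this card's links are redacted, the repository id below is assumed from the standard `open-llm-leaderboard/details_<org>__<model>` naming pattern):

```python
from datasets import load_dataset

# Load the per-example details of the latest run for one task configuration.
data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-33b-gpt4-1.4",
                    "harness_winogrande_5",
                    split="train")
```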
## Latest results
These are the latest results from run 2023-10-22T23:35:00.134396 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of jondurbin/airoboros-33b-gpt4-1.4",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model jondurbin/airoboros-33b-gpt4-1.4 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-22T23:35:00.134396(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of jondurbin/airoboros-33b-gpt4-1.4",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model jondurbin/airoboros-33b-gpt4-1.4 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-22T23:35:00.134396(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
23,
31,
171,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of jondurbin/airoboros-33b-gpt4-1.4## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model jondurbin/airoboros-33b-gpt4-1.4 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-22T23:35:00.134396(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
a2b0927a752cc3d6d8fe05186aed247b7e2ffe7a
|
# Dataset Card for Evaluation run of jondurbin/airoboros-33b-gpt4-2.0
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/jondurbin/airoboros-33b-gpt4-2.0
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [jondurbin/airoboros-33b-gpt4-2.0](https://huggingface.co/jondurbin/airoboros-33b-gpt4-2.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 4 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-33b-gpt4-2.0",
"harness_winogrande_5",
split="train")
```
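If helpful, the per-task config names mentioned above can be enumerated programmatically. A minimal sketch, assuming a recent version of the `datasets` library (which exposes a `get_dataset_config_names` helper):
```python
from datasets import get_dataset_config_names

# List every available config in this details repository:
# one per evaluated task, plus the aggregated "results" config.
configs = get_dataset_config_names("open-llm-leaderboard/details_jondurbin__airoboros-33b-gpt4-2.0")
print(len(configs), configs[:5])
```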
## Latest results
These are the [latest results from run 2023-10-22T18:00:44.135604](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-33b-gpt4-2.0/blob/main/results_2023-10-22T18-00-44.135604.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.2613255033557047,
"em_stderr": 0.004499425875530518,
"f1": 0.34639366610738487,
"f1_stderr": 0.0044375441149257415,
"acc": 0.4429523454483767,
"acc_stderr": 0.010086103101581952
},
"harness|drop|3": {
"em": 0.2613255033557047,
"em_stderr": 0.004499425875530518,
"f1": 0.34639366610738487,
"f1_stderr": 0.0044375441149257415
},
"harness|gsm8k|5": {
"acc": 0.1068991660348749,
"acc_stderr": 0.008510982565520494
},
"harness|winogrande|5": {
"acc": 0.7790055248618785,
"acc_stderr": 0.01166122363764341
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_jondurbin__airoboros-33b-gpt4-2.0
|
[
"region:us"
] |
2023-08-18T10:20:18+00:00
|
{"pretty_name": "Evaluation run of jondurbin/airoboros-33b-gpt4-2.0", "dataset_summary": "Dataset automatically created during the evaluation run of model [jondurbin/airoboros-33b-gpt4-2.0](https://huggingface.co/jondurbin/airoboros-33b-gpt4-2.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jondurbin__airoboros-33b-gpt4-2.0\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-22T18:00:44.135604](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-33b-gpt4-2.0/blob/main/results_2023-10-22T18-00-44.135604.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.2613255033557047,\n \"em_stderr\": 0.004499425875530518,\n \"f1\": 0.34639366610738487,\n \"f1_stderr\": 0.0044375441149257415,\n \"acc\": 0.4429523454483767,\n \"acc_stderr\": 0.010086103101581952\n },\n \"harness|drop|3\": {\n \"em\": 0.2613255033557047,\n \"em_stderr\": 0.004499425875530518,\n \"f1\": 0.34639366610738487,\n \"f1_stderr\": 0.0044375441149257415\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1068991660348749,\n \"acc_stderr\": 0.008510982565520494\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7790055248618785,\n \"acc_stderr\": 0.01166122363764341\n }\n}\n```", "repo_url": "https://huggingface.co/jondurbin/airoboros-33b-gpt4-2.0", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_03T10_45_43.401696", "path": ["**/details_harness|arc:challenge|25_2023-08-03T10:45:43.401696.parquet"]}, {"split": "2023_08_17T12_21_37.094883", "path": ["**/details_harness|arc:challenge|25_2023-08-17T12:21:37.094883.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-17T12:21:37.094883.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_18T09_25_31.995205", "path": ["**/details_harness|drop|3_2023-10-18T09-25-31.995205.parquet"]}, {"split": "2023_10_22T18_00_44.135604", "path": ["**/details_harness|drop|3_2023-10-22T18-00-44.135604.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-22T18-00-44.135604.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_18T09_25_31.995205", "path": ["**/details_harness|gsm8k|5_2023-10-18T09-25-31.995205.parquet"]}, {"split": "2023_10_22T18_00_44.135604", "path": ["**/details_harness|gsm8k|5_2023-10-22T18-00-44.135604.parquet"]}, {"split": "latest", "path": 
["**/details_harness|gsm8k|5_2023-10-22T18-00-44.135604.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_03T10_45_43.401696", "path": ["**/details_harness|hellaswag|10_2023-08-03T10:45:43.401696.parquet"]}, {"split": "2023_08_17T12_21_37.094883", "path": ["**/details_harness|hellaswag|10_2023-08-17T12:21:37.094883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-17T12:21:37.094883.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_03T10_45_43.401696", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-03T10:45:43.401696.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-03T10:45:43.401696.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-03T10:45:43.401696.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-03T10:45:43.401696.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-03T10:45:43.401696.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-03T10:45:43.401696.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-03T10:45:43.401696.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-03T10:45:43.401696.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-03T10:45:43.401696.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-03T10:45:43.401696.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-03T10:45:43.401696.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-03T10:45:43.401696.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-03T10:45:43.401696.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-03T10:45:43.401696.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-03T10:45:43.401696.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-03T10:45:43.401696.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-03T10:45:43.401696.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-03T10:45:43.401696.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-03T10:45:43.401696.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-03T10:45:43.401696.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-03T10:45:43.401696.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-03T10:45:43.401696.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-03T10:45:43.401696.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-03T10:45:43.401696.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-03T10:45:43.401696.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-03T10:45:43.401696.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-03T10:45:43.401696.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-03T10:45:43.401696.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-03T10:45:43.401696.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-03T10:45:43.401696.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-03T10:45:43.401696.parquet", 
"**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-03T10:45:43.401696.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-03T10:45:43.401696.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-03T10:45:43.401696.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-03T10:45:43.401696.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-03T10:45:43.401696.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-03T10:45:43.401696.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-03T10:45:43.401696.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-03T10:45:43.401696.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-03T10:45:43.401696.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-03T10:45:43.401696.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-03T10:45:43.401696.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-03T10:45:43.401696.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-03T10:45:43.401696.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-03T10:45:43.401696.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-03T10:45:43.401696.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-03T10:45:43.401696.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-03T10:45:43.401696.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-03T10:45:43.401696.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-03T10:45:43.401696.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-03T10:45:43.401696.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-03T10:45:43.401696.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-03T10:45:43.401696.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-03T10:45:43.401696.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-03T10:45:43.401696.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-03T10:45:43.401696.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-03T10:45:43.401696.parquet"]}, {"split": "2023_08_17T12_21_37.094883", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T12:21:37.094883.parquet", 
"**/details_harness|hendrycksTest-econometrics|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T12:21:37.094883.parquet", 
"**/details_harness|hendrycksTest-public_relations|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-17T12:21:37.094883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T12:21:37.094883.parquet", 
"**/details_harness|hendrycksTest-human_aging|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-17T12:21:37.094883.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-17T12:21:37.094883.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_03T10_45_43.401696", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-03T10:45:43.401696.parquet"]}, {"split": "2023_08_17T12_21_37.094883", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T12:21:37.094883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T12:21:37.094883.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_03T10_45_43.401696", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-03T10:45:43.401696.parquet"]}, {"split": "2023_08_17T12_21_37.094883", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-17T12:21:37.094883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-17T12:21:37.094883.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_03T10_45_43.401696", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-03T10:45:43.401696.parquet"]}, {"split": "2023_08_17T12_21_37.094883", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-17T12:21:37.094883.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-astronomy|5_2023-08-17T12:21:37.094883.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_03T10_45_43.401696", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-03T10:45:43.401696.parquet"]}, {"split": "2023_08_17T12_21_37.094883", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T12:21:37.094883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T12:21:37.094883.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_03T10_45_43.401696", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-03T10:45:43.401696.parquet"]}, {"split": "2023_08_17T12_21_37.094883", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T12:21:37.094883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T12:21:37.094883.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_03T10_45_43.401696", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-03T10:45:43.401696.parquet"]}, {"split": "2023_08_17T12_21_37.094883", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-17T12:21:37.094883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-17T12:21:37.094883.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_03T10_45_43.401696", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-03T10:45:43.401696.parquet"]}, {"split": "2023_08_17T12_21_37.094883", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T12:21:37.094883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T12:21:37.094883.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_03T10_45_43.401696", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-03T10:45:43.401696.parquet"]}, {"split": "2023_08_17T12_21_37.094883", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T12:21:37.094883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T12:21:37.094883.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_03T10_45_43.401696", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-03T10:45:43.401696.parquet"]}, {"split": "2023_08_17T12_21_37.094883", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T12:21:37.094883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T12:21:37.094883.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_03T10_45_43.401696", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-03T10:45:43.401696.parquet"]}, {"split": "2023_08_17T12_21_37.094883", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T12:21:37.094883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T12:21:37.094883.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", 
"data_files": [{"split": "2023_08_03T10_45_43.401696", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-03T10:45:43.401696.parquet"]}, {"split": "2023_08_17T12_21_37.094883", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-17T12:21:37.094883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-17T12:21:37.094883.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_03T10_45_43.401696", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-03T10:45:43.401696.parquet"]}, {"split": "2023_08_17T12_21_37.094883", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-17T12:21:37.094883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-17T12:21:37.094883.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_03T10_45_43.401696", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-03T10:45:43.401696.parquet"]}, {"split": "2023_08_17T12_21_37.094883", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T12:21:37.094883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T12:21:37.094883.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_03T10_45_43.401696", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-03T10:45:43.401696.parquet"]}, {"split": "2023_08_17T12_21_37.094883", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-17T12:21:37.094883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-17T12:21:37.094883.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_03T10_45_43.401696", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-03T10:45:43.401696.parquet"]}, {"split": "2023_08_17T12_21_37.094883", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T12:21:37.094883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T12:21:37.094883.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_03T10_45_43.401696", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-03T10:45:43.401696.parquet"]}, {"split": "2023_08_17T12_21_37.094883", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T12:21:37.094883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T12:21:37.094883.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_03T10_45_43.401696", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-03T10:45:43.401696.parquet"]}, {"split": "2023_08_17T12_21_37.094883", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T12:21:37.094883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T12:21:37.094883.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_03T10_45_43.401696", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-03T10:45:43.401696.parquet"]}, {"split": 
"2023_08_17T12_21_37.094883", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-17T12:21:37.094883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-17T12:21:37.094883.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_03T10_45_43.401696", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-03T10:45:43.401696.parquet"]}, {"split": "2023_08_17T12_21_37.094883", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T12:21:37.094883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T12:21:37.094883.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_03T10_45_43.401696", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-03T10:45:43.401696.parquet"]}, {"split": "2023_08_17T12_21_37.094883", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T12:21:37.094883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T12:21:37.094883.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_03T10_45_43.401696", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-03T10:45:43.401696.parquet"]}, {"split": "2023_08_17T12_21_37.094883", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T12:21:37.094883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T12:21:37.094883.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_03T10_45_43.401696", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-03T10:45:43.401696.parquet"]}, {"split": "2023_08_17T12_21_37.094883", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T12:21:37.094883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T12:21:37.094883.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_03T10_45_43.401696", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-03T10:45:43.401696.parquet"]}, {"split": "2023_08_17T12_21_37.094883", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T12:21:37.094883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T12:21:37.094883.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_03T10_45_43.401696", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-03T10:45:43.401696.parquet"]}, {"split": "2023_08_17T12_21_37.094883", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T12:21:37.094883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T12:21:37.094883.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_03T10_45_43.401696", "path": 
["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-03T10:45:43.401696.parquet"]}, {"split": "2023_08_17T12_21_37.094883", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T12:21:37.094883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T12:21:37.094883.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_03T10_45_43.401696", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-03T10:45:43.401696.parquet"]}, {"split": "2023_08_17T12_21_37.094883", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T12:21:37.094883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T12:21:37.094883.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_03T10_45_43.401696", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-03T10:45:43.401696.parquet"]}, {"split": "2023_08_17T12_21_37.094883", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T12:21:37.094883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T12:21:37.094883.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_03T10_45_43.401696", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-03T10:45:43.401696.parquet"]}, {"split": "2023_08_17T12_21_37.094883", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T12:21:37.094883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T12:21:37.094883.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_03T10_45_43.401696", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-03T10:45:43.401696.parquet"]}, {"split": "2023_08_17T12_21_37.094883", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T12:21:37.094883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T12:21:37.094883.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_03T10_45_43.401696", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-03T10:45:43.401696.parquet"]}, {"split": "2023_08_17T12_21_37.094883", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T12:21:37.094883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T12:21:37.094883.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_03T10_45_43.401696", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-03T10:45:43.401696.parquet"]}, {"split": "2023_08_17T12_21_37.094883", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T12:21:37.094883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T12:21:37.094883.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_03T10_45_43.401696", "path": 
["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-03T10:45:43.401696.parquet"]}, {"split": "2023_08_17T12_21_37.094883", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T12:21:37.094883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T12:21:37.094883.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_03T10_45_43.401696", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-03T10:45:43.401696.parquet"]}, {"split": "2023_08_17T12_21_37.094883", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-17T12:21:37.094883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-17T12:21:37.094883.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_03T10_45_43.401696", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-03T10:45:43.401696.parquet"]}, {"split": "2023_08_17T12_21_37.094883", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T12:21:37.094883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T12:21:37.094883.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_03T10_45_43.401696", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-03T10:45:43.401696.parquet"]}, {"split": "2023_08_17T12_21_37.094883", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-17T12:21:37.094883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-17T12:21:37.094883.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_03T10_45_43.401696", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-03T10:45:43.401696.parquet"]}, {"split": "2023_08_17T12_21_37.094883", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T12:21:37.094883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T12:21:37.094883.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_03T10_45_43.401696", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-03T10:45:43.401696.parquet"]}, {"split": "2023_08_17T12_21_37.094883", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T12:21:37.094883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T12:21:37.094883.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_03T10_45_43.401696", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-03T10:45:43.401696.parquet"]}, {"split": "2023_08_17T12_21_37.094883", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T12:21:37.094883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T12:21:37.094883.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_03T10_45_43.401696", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-03T10:45:43.401696.parquet"]}, {"split": "2023_08_17T12_21_37.094883", "path": 
["**/details_harness|hendrycksTest-management|5_2023-08-17T12:21:37.094883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-17T12:21:37.094883.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_03T10_45_43.401696", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-03T10:45:43.401696.parquet"]}, {"split": "2023_08_17T12_21_37.094883", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-17T12:21:37.094883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-17T12:21:37.094883.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_03T10_45_43.401696", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-03T10:45:43.401696.parquet"]}, {"split": "2023_08_17T12_21_37.094883", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T12:21:37.094883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T12:21:37.094883.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_03T10_45_43.401696", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-03T10:45:43.401696.parquet"]}, {"split": "2023_08_17T12_21_37.094883", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T12:21:37.094883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T12:21:37.094883.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_03T10_45_43.401696", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-03T10:45:43.401696.parquet"]}, {"split": "2023_08_17T12_21_37.094883", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T12:21:37.094883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T12:21:37.094883.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_03T10_45_43.401696", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-03T10:45:43.401696.parquet"]}, {"split": "2023_08_17T12_21_37.094883", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T12:21:37.094883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T12:21:37.094883.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_03T10_45_43.401696", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-03T10:45:43.401696.parquet"]}, {"split": "2023_08_17T12_21_37.094883", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-17T12:21:37.094883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-17T12:21:37.094883.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_03T10_45_43.401696", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-03T10:45:43.401696.parquet"]}, {"split": "2023_08_17T12_21_37.094883", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-17T12:21:37.094883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-17T12:21:37.094883.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": 
"2023_08_03T10_45_43.401696", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-03T10:45:43.401696.parquet"]}, {"split": "2023_08_17T12_21_37.094883", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-17T12:21:37.094883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-17T12:21:37.094883.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_03T10_45_43.401696", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-03T10:45:43.401696.parquet"]}, {"split": "2023_08_17T12_21_37.094883", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T12:21:37.094883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T12:21:37.094883.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_03T10_45_43.401696", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-03T10:45:43.401696.parquet"]}, {"split": "2023_08_17T12_21_37.094883", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-17T12:21:37.094883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-17T12:21:37.094883.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_03T10_45_43.401696", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-03T10:45:43.401696.parquet"]}, {"split": "2023_08_17T12_21_37.094883", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T12:21:37.094883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T12:21:37.094883.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_03T10_45_43.401696", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-03T10:45:43.401696.parquet"]}, {"split": "2023_08_17T12_21_37.094883", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T12:21:37.094883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T12:21:37.094883.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_03T10_45_43.401696", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-03T10:45:43.401696.parquet"]}, {"split": "2023_08_17T12_21_37.094883", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-17T12:21:37.094883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-17T12:21:37.094883.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_03T10_45_43.401696", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-03T10:45:43.401696.parquet"]}, {"split": "2023_08_17T12_21_37.094883", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-17T12:21:37.094883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-17T12:21:37.094883.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_03T10_45_43.401696", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-03T10:45:43.401696.parquet"]}, {"split": 
"2023_08_17T12_21_37.094883", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-17T12:21:37.094883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-17T12:21:37.094883.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_03T10_45_43.401696", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-03T10:45:43.401696.parquet"]}, {"split": "2023_08_17T12_21_37.094883", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T12:21:37.094883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T12:21:37.094883.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_03T10_45_43.401696", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-03T10:45:43.401696.parquet"]}, {"split": "2023_08_17T12_21_37.094883", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-17T12:21:37.094883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-17T12:21:37.094883.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_03T10_45_43.401696", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-03T10:45:43.401696.parquet"]}, {"split": "2023_08_17T12_21_37.094883", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-17T12:21:37.094883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-17T12:21:37.094883.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_03T10_45_43.401696", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-03T10:45:43.401696.parquet"]}, {"split": "2023_08_17T12_21_37.094883", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-17T12:21:37.094883.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-17T12:21:37.094883.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_18T09_25_31.995205", "path": ["**/details_harness|winogrande|5_2023-10-18T09-25-31.995205.parquet"]}, {"split": "2023_10_22T18_00_44.135604", "path": ["**/details_harness|winogrande|5_2023-10-22T18-00-44.135604.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-22T18-00-44.135604.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_03T10_45_43.401696", "path": ["results_2023-08-03T10:45:43.401696.parquet"]}, {"split": "2023_08_17T12_21_37.094883", "path": ["results_2023-08-17T12:21:37.094883.parquet"]}, {"split": "2023_10_18T09_25_31.995205", "path": ["results_2023-10-18T09-25-31.995205.parquet"]}, {"split": "2023_10_22T18_00_44.135604", "path": ["results_2023-10-22T18-00-44.135604.parquet"]}, {"split": "latest", "path": ["results_2023-10-22T18-00-44.135604.parquet"]}]}]}
|
2023-10-22T17:00:56+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of jondurbin/airoboros-33b-gpt4-2.0
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model jondurbin/airoboros-33b-gpt4-2.0 on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 4 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
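A minimal sketch, mirroring the loader call shown in the full card above (the "harness_winogrande_5" config is just one example; any config name from the metadata works):
```python
from datasets import load_dataset

# Load the per-sample details for one evaluation config of this run.
data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-33b-gpt4-2.0",
	"harness_winogrande_5",
	split="train")
```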
## Latest results
These are the latest results from run 2023-10-22T18:00:44.135604 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
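As a sketch of how to pull these aggregated numbers directly, assuming the "results" config and "latest" split declared in the metadata above:
```python
from datasets import load_dataset

# The "results" config aggregates every run; the "latest" split points
# at the most recent one (2023-10-22T18:00:44.135604 for this card).
results = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-33b-gpt4-2.0",
	"results",
	split="latest")
```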
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of jondurbin/airoboros-33b-gpt4-2.0",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model jondurbin/airoboros-33b-gpt4-2.0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-22T18:00:44.135604(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of jondurbin/airoboros-33b-gpt4-2.0",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model jondurbin/airoboros-33b-gpt4-2.0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-22T18:00:44.135604(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
23,
31,
171,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of jondurbin/airoboros-33b-gpt4-2.0## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model jondurbin/airoboros-33b-gpt4-2.0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-22T18:00:44.135604(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
e8db12ca591063d79aab9009036899aa94ed409f
|
# Dataset Card for Evaluation run of jondurbin/airoboros-7b-gpt4
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/jondurbin/airoboros-7b-gpt4
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [jondurbin/airoboros-7b-gpt4](https://huggingface.co/jondurbin/airoboros-7b-gpt4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-7b-gpt4",
"harness_winogrande_5",
split="train")
```
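If you want the aggregated metrics rather than the per-sample details, you can load the "results" configuration in the same way (a minimal sketch; per this repo's config metadata, each configuration also exposes a `latest` split pointing to the most recent run):
```python
from datasets import load_dataset

# Load the aggregated results of the most recent run for this model.
results = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-7b-gpt4",
	"results",
	split="latest")
```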
## Latest results
These are the [latest results from run 2023-10-22T06:51:15.368874](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-7b-gpt4/blob/main/results_2023-10-22T06-51-15.368874.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.24276426174496643,
"em_stderr": 0.004390839668047224,
"f1": 0.3038569630872493,
"f1_stderr": 0.004387376487144696,
"acc": 0.37414887626834564,
"acc_stderr": 0.008035199409633497
},
"harness|drop|3": {
"em": 0.24276426174496643,
"em_stderr": 0.004390839668047224,
"f1": 0.3038569630872493,
"f1_stderr": 0.004387376487144696
},
"harness|gsm8k|5": {
"acc": 0.017437452615617893,
"acc_stderr": 0.0036054868679982572
},
"harness|winogrande|5": {
"acc": 0.7308602999210734,
"acc_stderr": 0.012464911951268738
}
}
```
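As a quick sanity check, the aggregated "all" block above appears to be the unweighted mean of the per-task metrics; a minimal sketch using the values reported above:
```python
# The "all" accuracy matches the unweighted mean of the per-task
# accuracies reported above (gsm8k and winogrande).
gsm8k_acc = 0.017437452615617893
winogrande_acc = 0.7308602999210734
print((gsm8k_acc + winogrande_acc) / 2)  # ~0.37414887626834564, the "all" acc
```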
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_jondurbin__airoboros-7b-gpt4
|
[
"region:us"
] |
2023-08-18T10:20:36+00:00
|
{"pretty_name": "Evaluation run of jondurbin/airoboros-7b-gpt4", "dataset_summary": "Dataset automatically created during the evaluation run of model [jondurbin/airoboros-7b-gpt4](https://huggingface.co/jondurbin/airoboros-7b-gpt4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jondurbin__airoboros-7b-gpt4\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-22T06:51:15.368874](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-7b-gpt4/blob/main/results_2023-10-22T06-51-15.368874.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.24276426174496643,\n \"em_stderr\": 0.004390839668047224,\n \"f1\": 0.3038569630872493,\n \"f1_stderr\": 0.004387376487144696,\n \"acc\": 0.37414887626834564,\n \"acc_stderr\": 0.008035199409633497\n },\n \"harness|drop|3\": {\n \"em\": 0.24276426174496643,\n \"em_stderr\": 0.004390839668047224,\n \"f1\": 0.3038569630872493,\n \"f1_stderr\": 0.004387376487144696\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.017437452615617893,\n \"acc_stderr\": 0.0036054868679982572\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7308602999210734,\n \"acc_stderr\": 0.012464911951268738\n }\n}\n```", "repo_url": "https://huggingface.co/jondurbin/airoboros-7b-gpt4", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_31T14_10_25.763486", "path": ["**/details_harness|arc:challenge|25_2023-07-31T14:10:25.763486.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-31T14:10:25.763486.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_22T06_51_15.368874", "path": ["**/details_harness|drop|3_2023-10-22T06-51-15.368874.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-22T06-51-15.368874.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_22T06_51_15.368874", "path": ["**/details_harness|gsm8k|5_2023-10-22T06-51-15.368874.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-22T06-51-15.368874.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_31T14_10_25.763486", "path": ["**/details_harness|hellaswag|10_2023-07-31T14:10:25.763486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-31T14:10:25.763486.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", 
"data_files": [{"split": "2023_07_31T14_10_25.763486", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T14:10:25.763486.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-31T14:10:25.763486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T14:10:25.763486.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-31T14:10:25.763486.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-31T14:10:25.763486.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_31T14_10_25.763486", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T14:10:25.763486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T14:10:25.763486.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_31T14_10_25.763486", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-31T14:10:25.763486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-31T14:10:25.763486.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_31T14_10_25.763486", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-31T14:10:25.763486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-31T14:10:25.763486.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_31T14_10_25.763486", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T14:10:25.763486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T14:10:25.763486.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_31T14_10_25.763486", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T14:10:25.763486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T14:10:25.763486.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_31T14_10_25.763486", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-31T14:10:25.763486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-31T14:10:25.763486.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_31T14_10_25.763486", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T14:10:25.763486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T14:10:25.763486.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_31T14_10_25.763486", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T14:10:25.763486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T14:10:25.763486.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_31T14_10_25.763486", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T14:10:25.763486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T14:10:25.763486.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_31T14_10_25.763486", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T14:10:25.763486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T14:10:25.763486.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_31T14_10_25.763486", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-31T14:10:25.763486.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-31T14:10:25.763486.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_31T14_10_25.763486", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-31T14:10:25.763486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-31T14:10:25.763486.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_31T14_10_25.763486", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T14:10:25.763486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T14:10:25.763486.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_31T14_10_25.763486", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-31T14:10:25.763486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-31T14:10:25.763486.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_31T14_10_25.763486", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T14:10:25.763486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T14:10:25.763486.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_31T14_10_25.763486", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T14:10:25.763486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T14:10:25.763486.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_31T14_10_25.763486", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T14:10:25.763486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T14:10:25.763486.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_31T14_10_25.763486", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-31T14:10:25.763486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-31T14:10:25.763486.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_31T14_10_25.763486", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T14:10:25.763486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T14:10:25.763486.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_31T14_10_25.763486", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T14:10:25.763486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T14:10:25.763486.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_31T14_10_25.763486", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T14:10:25.763486.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T14:10:25.763486.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_31T14_10_25.763486", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T14:10:25.763486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T14:10:25.763486.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_31T14_10_25.763486", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T14:10:25.763486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T14:10:25.763486.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_31T14_10_25.763486", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T14:10:25.763486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T14:10:25.763486.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_31T14_10_25.763486", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T14:10:25.763486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T14:10:25.763486.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_31T14_10_25.763486", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T14:10:25.763486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T14:10:25.763486.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_31T14_10_25.763486", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T14:10:25.763486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T14:10:25.763486.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_31T14_10_25.763486", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T14:10:25.763486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T14:10:25.763486.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_31T14_10_25.763486", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T14:10:25.763486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T14:10:25.763486.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_31T14_10_25.763486", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T14:10:25.763486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T14:10:25.763486.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_31T14_10_25.763486", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T14:10:25.763486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T14:10:25.763486.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_31T14_10_25.763486", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T14:10:25.763486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T14:10:25.763486.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_31T14_10_25.763486", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-31T14:10:25.763486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-31T14:10:25.763486.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_31T14_10_25.763486", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T14:10:25.763486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T14:10:25.763486.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_31T14_10_25.763486", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-31T14:10:25.763486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-31T14:10:25.763486.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_31T14_10_25.763486", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T14:10:25.763486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T14:10:25.763486.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_31T14_10_25.763486", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T14:10:25.763486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T14:10:25.763486.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_31T14_10_25.763486", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T14:10:25.763486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T14:10:25.763486.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_31T14_10_25.763486", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-31T14:10:25.763486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-31T14:10:25.763486.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_31T14_10_25.763486", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-31T14:10:25.763486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-31T14:10:25.763486.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_31T14_10_25.763486", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T14:10:25.763486.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T14:10:25.763486.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_31T14_10_25.763486", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T14:10:25.763486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T14:10:25.763486.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_31T14_10_25.763486", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T14:10:25.763486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T14:10:25.763486.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_31T14_10_25.763486", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T14:10:25.763486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T14:10:25.763486.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_31T14_10_25.763486", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-31T14:10:25.763486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-31T14:10:25.763486.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_31T14_10_25.763486", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-31T14:10:25.763486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-31T14:10:25.763486.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_31T14_10_25.763486", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-31T14:10:25.763486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-31T14:10:25.763486.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_31T14_10_25.763486", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T14:10:25.763486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T14:10:25.763486.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_31T14_10_25.763486", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-31T14:10:25.763486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-31T14:10:25.763486.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_31T14_10_25.763486", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T14:10:25.763486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T14:10:25.763486.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_31T14_10_25.763486", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T14:10:25.763486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T14:10:25.763486.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_31T14_10_25.763486", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-31T14:10:25.763486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-31T14:10:25.763486.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_31T14_10_25.763486", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-31T14:10:25.763486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-31T14:10:25.763486.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_31T14_10_25.763486", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-31T14:10:25.763486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-31T14:10:25.763486.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_31T14_10_25.763486", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T14:10:25.763486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T14:10:25.763486.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_31T14_10_25.763486", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-31T14:10:25.763486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-31T14:10:25.763486.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_31T14_10_25.763486", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-31T14:10:25.763486.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-31T14:10:25.763486.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_31T14_10_25.763486", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-31T14:10:25.763486.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-31T14:10:25.763486.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_22T06_51_15.368874", "path": ["**/details_harness|winogrande|5_2023-10-22T06-51-15.368874.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-22T06-51-15.368874.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_31T14_10_25.763486", "path": ["results_2023-07-31T14:10:25.763486.parquet"]}, {"split": "2023_10_22T06_51_15.368874", "path": ["results_2023-10-22T06-51-15.368874.parquet"]}, {"split": "latest", "path": ["results_2023-10-22T06-51-15.368874.parquet"]}]}]}
|
2023-10-22T05:51:28+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of jondurbin/airoboros-7b-gpt4
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model jondurbin/airoboros-7b-gpt4 on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-10-22T06:51:15.368874 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of jondurbin/airoboros-7b-gpt4",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model jondurbin/airoboros-7b-gpt4 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-22T06:51:15.368874(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of jondurbin/airoboros-7b-gpt4",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model jondurbin/airoboros-7b-gpt4 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-22T06:51:15.368874(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
22,
31,
170,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of jondurbin/airoboros-7b-gpt4## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model jondurbin/airoboros-7b-gpt4 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-22T06:51:15.368874(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
6f50d03699d82fb1c6e40fd504480d1a44510c76
|
# Dataset Card for Evaluation run of jondurbin/airoboros-7b-gpt4-1.4
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/jondurbin/airoboros-7b-gpt4-1.4
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [jondurbin/airoboros-7b-gpt4-1.4](https://huggingface.co/jondurbin/airoboros-7b-gpt4-1.4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-7b-gpt4-1.4",
"harness_winogrande_5",
split="train")
```
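Besides `latest`, each configuration also exposes splits named after individual run timestamps; a minimal sketch of loading one specific run (the config name and split below are taken from this repo's config metadata, where the split name is the run timestamp with `-` and `:` replaced by `_`):
```python
from datasets import load_dataset

# Load the details of one specific run instead of the latest one.
data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-7b-gpt4-1.4",
	"harness_gsm8k_5",
	split="2023_10_16T11_21_20.910365")
```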
## Latest results
These are the [latest results from run 2023-10-16T11:21:20.910365](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-7b-gpt4-1.4/blob/main/results_2023-10-16T11-21-20.910365.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.04393875838926174,
"em_stderr": 0.0020989708043196364,
"f1": 0.10162122483221474,
"f1_stderr": 0.002382351530884103,
"acc": 0.38242629578146603,
"acc_stderr": 0.008860107137263845
},
"harness|drop|3": {
"em": 0.04393875838926174,
"em_stderr": 0.0020989708043196364,
"f1": 0.10162122483221474,
"f1_stderr": 0.002382351530884103
},
"harness|gsm8k|5": {
"acc": 0.037149355572403335,
"acc_stderr": 0.0052095162830737545
},
"harness|winogrande|5": {
"acc": 0.7277032359905288,
"acc_stderr": 0.012510697991453934
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_jondurbin__airoboros-7b-gpt4-1.4
|
[
"region:us"
] |
2023-08-18T10:20:45+00:00
|
{"pretty_name": "Evaluation run of jondurbin/airoboros-7b-gpt4-1.4", "dataset_summary": "Dataset automatically created during the evaluation run of model [jondurbin/airoboros-7b-gpt4-1.4](https://huggingface.co/jondurbin/airoboros-7b-gpt4-1.4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jondurbin__airoboros-7b-gpt4-1.4\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-16T11:21:20.910365](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-7b-gpt4-1.4/blob/main/results_2023-10-16T11-21-20.910365.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.04393875838926174,\n \"em_stderr\": 0.0020989708043196364,\n \"f1\": 0.10162122483221474,\n \"f1_stderr\": 0.002382351530884103,\n \"acc\": 0.38242629578146603,\n \"acc_stderr\": 0.008860107137263845\n },\n \"harness|drop|3\": {\n \"em\": 0.04393875838926174,\n \"em_stderr\": 0.0020989708043196364,\n \"f1\": 0.10162122483221474,\n \"f1_stderr\": 0.002382351530884103\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.037149355572403335,\n \"acc_stderr\": 0.0052095162830737545\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7277032359905288,\n \"acc_stderr\": 0.012510697991453934\n }\n}\n```", "repo_url": "https://huggingface.co/jondurbin/airoboros-7b-gpt4-1.4", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_24T12_01_20.647029", "path": ["**/details_harness|arc:challenge|25_2023-07-24T12:01:20.647029.parquet"]}, {"split": "2023_08_03T10_50_49.881467", "path": ["**/details_harness|arc:challenge|25_2023-08-03T10:50:49.881467.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-03T10:50:49.881467.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_16T11_21_20.910365", "path": ["**/details_harness|drop|3_2023-10-16T11-21-20.910365.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-16T11-21-20.910365.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_16T11_21_20.910365", "path": ["**/details_harness|gsm8k|5_2023-10-16T11-21-20.910365.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-16T11-21-20.910365.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_24T12_01_20.647029", "path": ["**/details_harness|hellaswag|10_2023-07-24T12:01:20.647029.parquet"]}, 
{"split": "2023_08_03T10_50_49.881467", "path": ["**/details_harness|hellaswag|10_2023-08-03T10:50:49.881467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-03T10:50:49.881467.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_24T12_01_20.647029", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T12:01:20.647029.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T12:01:20.647029.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T12:01:20.647029.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T12:01:20.647029.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T12:01:20.647029.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T12:01:20.647029.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T12:01:20.647029.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T12:01:20.647029.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T12:01:20.647029.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T12:01:20.647029.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T12:01:20.647029.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T12:01:20.647029.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T12:01:20.647029.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-24T12:01:20.647029.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T12:01:20.647029.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T12:01:20.647029.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T12:01:20.647029.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T12:01:20.647029.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T12:01:20.647029.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T12:01:20.647029.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T12:01:20.647029.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T12:01:20.647029.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T12:01:20.647029.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T12:01:20.647029.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T12:01:20.647029.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T12:01:20.647029.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T12:01:20.647029.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T12:01:20.647029.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T12:01:20.647029.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T12:01:20.647029.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T12:01:20.647029.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T12:01:20.647029.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-24T12:01:20.647029.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T12:01:20.647029.parquet", 
"**/details_harness|hendrycksTest-international_law|5_2023-07-24T12:01:20.647029.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T12:01:20.647029.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T12:01:20.647029.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T12:01:20.647029.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-24T12:01:20.647029.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T12:01:20.647029.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T12:01:20.647029.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T12:01:20.647029.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T12:01:20.647029.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T12:01:20.647029.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T12:01:20.647029.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T12:01:20.647029.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T12:01:20.647029.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T12:01:20.647029.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T12:01:20.647029.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T12:01:20.647029.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T12:01:20.647029.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-24T12:01:20.647029.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T12:01:20.647029.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T12:01:20.647029.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T12:01:20.647029.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T12:01:20.647029.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-24T12:01:20.647029.parquet"]}, {"split": "2023_08_03T10_50_49.881467", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-03T10:50:49.881467.parquet", 
"**/details_harness|hendrycksTest-formal_logic|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-03T10:50:49.881467.parquet", 
"**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-03T10:50:49.881467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-03T10:50:49.881467.parquet", 
"**/details_harness|hendrycksTest-jurisprudence|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-03T10:50:49.881467.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-03T10:50:49.881467.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_24T12_01_20.647029", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T12:01:20.647029.parquet"]}, {"split": "2023_08_03T10_50_49.881467", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-03T10:50:49.881467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-03T10:50:49.881467.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_24T12_01_20.647029", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T12:01:20.647029.parquet"]}, {"split": "2023_08_03T10_50_49.881467", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-03T10:50:49.881467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-03T10:50:49.881467.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_24T12_01_20.647029", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T12:01:20.647029.parquet"]}, {"split": "2023_08_03T10_50_49.881467", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-03T10:50:49.881467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-03T10:50:49.881467.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_24T12_01_20.647029", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T12:01:20.647029.parquet"]}, {"split": 
"2023_08_03T10_50_49.881467", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-03T10:50:49.881467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-03T10:50:49.881467.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_24T12_01_20.647029", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T12:01:20.647029.parquet"]}, {"split": "2023_08_03T10_50_49.881467", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-03T10:50:49.881467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-03T10:50:49.881467.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_24T12_01_20.647029", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T12:01:20.647029.parquet"]}, {"split": "2023_08_03T10_50_49.881467", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-03T10:50:49.881467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-03T10:50:49.881467.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_24T12_01_20.647029", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T12:01:20.647029.parquet"]}, {"split": "2023_08_03T10_50_49.881467", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-03T10:50:49.881467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-03T10:50:49.881467.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_24T12_01_20.647029", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T12:01:20.647029.parquet"]}, {"split": "2023_08_03T10_50_49.881467", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-03T10:50:49.881467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-03T10:50:49.881467.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_24T12_01_20.647029", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T12:01:20.647029.parquet"]}, {"split": "2023_08_03T10_50_49.881467", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-03T10:50:49.881467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-03T10:50:49.881467.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_24T12_01_20.647029", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T12:01:20.647029.parquet"]}, {"split": "2023_08_03T10_50_49.881467", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-03T10:50:49.881467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-03T10:50:49.881467.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_24T12_01_20.647029", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T12:01:20.647029.parquet"]}, {"split": "2023_08_03T10_50_49.881467", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-03T10:50:49.881467.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-03T10:50:49.881467.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_24T12_01_20.647029", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T12:01:20.647029.parquet"]}, {"split": "2023_08_03T10_50_49.881467", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-03T10:50:49.881467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-03T10:50:49.881467.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_24T12_01_20.647029", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T12:01:20.647029.parquet"]}, {"split": "2023_08_03T10_50_49.881467", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-03T10:50:49.881467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-03T10:50:49.881467.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_24T12_01_20.647029", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T12:01:20.647029.parquet"]}, {"split": "2023_08_03T10_50_49.881467", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-03T10:50:49.881467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-03T10:50:49.881467.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_24T12_01_20.647029", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T12:01:20.647029.parquet"]}, {"split": "2023_08_03T10_50_49.881467", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-03T10:50:49.881467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-03T10:50:49.881467.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_24T12_01_20.647029", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T12:01:20.647029.parquet"]}, {"split": "2023_08_03T10_50_49.881467", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-03T10:50:49.881467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-03T10:50:49.881467.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_24T12_01_20.647029", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T12:01:20.647029.parquet"]}, {"split": "2023_08_03T10_50_49.881467", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-03T10:50:49.881467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-03T10:50:49.881467.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_24T12_01_20.647029", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T12:01:20.647029.parquet"]}, {"split": "2023_08_03T10_50_49.881467", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-03T10:50:49.881467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-03T10:50:49.881467.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", 
"data_files": [{"split": "2023_07_24T12_01_20.647029", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T12:01:20.647029.parquet"]}, {"split": "2023_08_03T10_50_49.881467", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-03T10:50:49.881467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-03T10:50:49.881467.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_24T12_01_20.647029", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T12:01:20.647029.parquet"]}, {"split": "2023_08_03T10_50_49.881467", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-03T10:50:49.881467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-03T10:50:49.881467.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_24T12_01_20.647029", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T12:01:20.647029.parquet"]}, {"split": "2023_08_03T10_50_49.881467", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-03T10:50:49.881467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-03T10:50:49.881467.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_24T12_01_20.647029", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T12:01:20.647029.parquet"]}, {"split": "2023_08_03T10_50_49.881467", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-03T10:50:49.881467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-03T10:50:49.881467.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_24T12_01_20.647029", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T12:01:20.647029.parquet"]}, {"split": "2023_08_03T10_50_49.881467", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-03T10:50:49.881467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-03T10:50:49.881467.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_24T12_01_20.647029", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T12:01:20.647029.parquet"]}, {"split": "2023_08_03T10_50_49.881467", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-03T10:50:49.881467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-03T10:50:49.881467.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_24T12_01_20.647029", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T12:01:20.647029.parquet"]}, {"split": "2023_08_03T10_50_49.881467", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-03T10:50:49.881467.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-03T10:50:49.881467.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_24T12_01_20.647029", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T12:01:20.647029.parquet"]}, {"split": "2023_08_03T10_50_49.881467", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-03T10:50:49.881467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-03T10:50:49.881467.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_24T12_01_20.647029", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T12:01:20.647029.parquet"]}, {"split": "2023_08_03T10_50_49.881467", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-03T10:50:49.881467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-03T10:50:49.881467.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_24T12_01_20.647029", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T12:01:20.647029.parquet"]}, {"split": "2023_08_03T10_50_49.881467", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-03T10:50:49.881467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-03T10:50:49.881467.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_24T12_01_20.647029", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T12:01:20.647029.parquet"]}, {"split": "2023_08_03T10_50_49.881467", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-03T10:50:49.881467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-03T10:50:49.881467.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_24T12_01_20.647029", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T12:01:20.647029.parquet"]}, {"split": "2023_08_03T10_50_49.881467", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-03T10:50:49.881467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-03T10:50:49.881467.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_24T12_01_20.647029", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T12:01:20.647029.parquet"]}, {"split": "2023_08_03T10_50_49.881467", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-03T10:50:49.881467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-03T10:50:49.881467.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_24T12_01_20.647029", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T12:01:20.647029.parquet"]}, {"split": "2023_08_03T10_50_49.881467", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-03T10:50:49.881467.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-03T10:50:49.881467.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_24T12_01_20.647029", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T12:01:20.647029.parquet"]}, {"split": "2023_08_03T10_50_49.881467", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-03T10:50:49.881467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-03T10:50:49.881467.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_24T12_01_20.647029", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T12:01:20.647029.parquet"]}, {"split": "2023_08_03T10_50_49.881467", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-03T10:50:49.881467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-03T10:50:49.881467.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_24T12_01_20.647029", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T12:01:20.647029.parquet"]}, {"split": "2023_08_03T10_50_49.881467", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-03T10:50:49.881467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-03T10:50:49.881467.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_24T12_01_20.647029", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T12:01:20.647029.parquet"]}, {"split": "2023_08_03T10_50_49.881467", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-03T10:50:49.881467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-03T10:50:49.881467.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_24T12_01_20.647029", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T12:01:20.647029.parquet"]}, {"split": "2023_08_03T10_50_49.881467", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-03T10:50:49.881467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-03T10:50:49.881467.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_24T12_01_20.647029", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T12:01:20.647029.parquet"]}, {"split": "2023_08_03T10_50_49.881467", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-03T10:50:49.881467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-03T10:50:49.881467.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_24T12_01_20.647029", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T12:01:20.647029.parquet"]}, {"split": "2023_08_03T10_50_49.881467", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-03T10:50:49.881467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-03T10:50:49.881467.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_24T12_01_20.647029", "path": 
["**/details_harness|hendrycksTest-marketing|5_2023-07-24T12:01:20.647029.parquet"]}, {"split": "2023_08_03T10_50_49.881467", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-03T10:50:49.881467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-03T10:50:49.881467.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_24T12_01_20.647029", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T12:01:20.647029.parquet"]}, {"split": "2023_08_03T10_50_49.881467", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-03T10:50:49.881467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-03T10:50:49.881467.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_24T12_01_20.647029", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T12:01:20.647029.parquet"]}, {"split": "2023_08_03T10_50_49.881467", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-03T10:50:49.881467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-03T10:50:49.881467.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_24T12_01_20.647029", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T12:01:20.647029.parquet"]}, {"split": "2023_08_03T10_50_49.881467", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-03T10:50:49.881467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-03T10:50:49.881467.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_24T12_01_20.647029", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T12:01:20.647029.parquet"]}, {"split": "2023_08_03T10_50_49.881467", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-03T10:50:49.881467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-03T10:50:49.881467.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_24T12_01_20.647029", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T12:01:20.647029.parquet"]}, {"split": "2023_08_03T10_50_49.881467", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-03T10:50:49.881467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-03T10:50:49.881467.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_24T12_01_20.647029", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T12:01:20.647029.parquet"]}, {"split": "2023_08_03T10_50_49.881467", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-03T10:50:49.881467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-03T10:50:49.881467.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_24T12_01_20.647029", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T12:01:20.647029.parquet"]}, {"split": "2023_08_03T10_50_49.881467", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-03T10:50:49.881467.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-prehistory|5_2023-08-03T10:50:49.881467.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_24T12_01_20.647029", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T12:01:20.647029.parquet"]}, {"split": "2023_08_03T10_50_49.881467", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-03T10:50:49.881467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-03T10:50:49.881467.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_24T12_01_20.647029", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T12:01:20.647029.parquet"]}, {"split": "2023_08_03T10_50_49.881467", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-03T10:50:49.881467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-03T10:50:49.881467.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_24T12_01_20.647029", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T12:01:20.647029.parquet"]}, {"split": "2023_08_03T10_50_49.881467", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-03T10:50:49.881467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-03T10:50:49.881467.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_24T12_01_20.647029", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T12:01:20.647029.parquet"]}, {"split": "2023_08_03T10_50_49.881467", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-03T10:50:49.881467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-03T10:50:49.881467.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_24T12_01_20.647029", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T12:01:20.647029.parquet"]}, {"split": "2023_08_03T10_50_49.881467", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-03T10:50:49.881467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-03T10:50:49.881467.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_24T12_01_20.647029", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T12:01:20.647029.parquet"]}, {"split": "2023_08_03T10_50_49.881467", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-03T10:50:49.881467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-03T10:50:49.881467.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_24T12_01_20.647029", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T12:01:20.647029.parquet"]}, {"split": "2023_08_03T10_50_49.881467", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-03T10:50:49.881467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-03T10:50:49.881467.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_24T12_01_20.647029", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T12:01:20.647029.parquet"]}, {"split": "2023_08_03T10_50_49.881467", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-03T10:50:49.881467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-03T10:50:49.881467.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_24T12_01_20.647029", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T12:01:20.647029.parquet"]}, {"split": "2023_08_03T10_50_49.881467", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-03T10:50:49.881467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-03T10:50:49.881467.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_24T12_01_20.647029", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T12:01:20.647029.parquet"]}, {"split": "2023_08_03T10_50_49.881467", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-03T10:50:49.881467.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-03T10:50:49.881467.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_24T12_01_20.647029", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T12:01:20.647029.parquet"]}, {"split": "2023_08_03T10_50_49.881467", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-03T10:50:49.881467.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-03T10:50:49.881467.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_16T11_21_20.910365", "path": ["**/details_harness|winogrande|5_2023-10-16T11-21-20.910365.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-16T11-21-20.910365.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_24T12_01_20.647029", "path": ["results_2023-07-24T12:01:20.647029.parquet"]}, {"split": "2023_08_03T10_50_49.881467", "path": ["results_2023-08-03T10:50:49.881467.parquet"]}, {"split": "2023_10_16T11_21_20.910365", "path": ["results_2023-10-16T11-21-20.910365.parquet"]}, {"split": "latest", "path": ["results_2023-10-16T11-21-20.910365.parquet"]}]}]}
|
2023-10-16T10:21:33+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of jondurbin/airoboros-7b-gpt4-1.4
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/jondurbin/airoboros-7b-gpt4-1.4
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [jondurbin/airoboros-7b-gpt4-1.4](https://huggingface.co/jondurbin/airoboros-7b-gpt4-1.4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
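```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-7b-gpt4-1.4",
	"harness_winogrande_5",
	split="train")
```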
## Latest results
These are the [latest results from run 2023-10-16T11:21:20.910365](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-7b-gpt4-1.4/blob/main/results_2023-10-16T11-21-20.910365.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
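```python
{
    "all": {
        "em": 0.04393875838926174,
        "em_stderr": 0.0020989708043196364,
        "f1": 0.10162122483221474,
        "f1_stderr": 0.002382351530884103,
        "acc": 0.38242629578146603,
        "acc_stderr": 0.008860107137263845
    },
    "harness|drop|3": {
        "em": 0.04393875838926174,
        "em_stderr": 0.0020989708043196364,
        "f1": 0.10162122483221474,
        "f1_stderr": 0.002382351530884103
    },
    "harness|gsm8k|5": {
        "acc": 0.037149355572403335,
        "acc_stderr": 0.0052095162830737545
    },
    "harness|winogrande|5": {
        "acc": 0.7277032359905288,
        "acc_stderr": 0.012510697991453934
    }
}
```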
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of jondurbin/airoboros-7b-gpt4-1.4",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model jondurbin/airoboros-7b-gpt4-1.4 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-16T11:21:20.910365(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of jondurbin/airoboros-7b-gpt4-1.4",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model jondurbin/airoboros-7b-gpt4-1.4 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-16T11:21:20.910365(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
23,
31,
171,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of jondurbin/airoboros-7b-gpt4-1.4## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model jondurbin/airoboros-7b-gpt4-1.4 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-16T11:21:20.910365(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
e800085342f575c6b1b67709a9ca9a94a0a56c72
|
# Dataset Card for Evaluation run of jondurbin/airoboros-gpt-3.5-turbo-100k-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/jondurbin/airoboros-gpt-3.5-turbo-100k-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [jondurbin/airoboros-gpt-3.5-turbo-100k-7b](https://huggingface.co/jondurbin/airoboros-gpt-3.5-turbo-100k-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-gpt-3.5-turbo-100k-7b",
"harness_winogrande_5",
split="train")
```
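Each earlier run is also addressable directly by its timestamped split name, and the aggregated metrics live in the "results" configuration. A minimal sketch, assuming the config and split names listed in this card's metadata (the "results"/"latest" pair is assumed to follow the same layout as the other evaluation cards above):
```python
from datasets import load_dataset

# Load the 2023-08-03 ARC-Challenge run for this model by its
# timestamped split name (taken from the "harness_arc_challenge_25" config).
arc_run = load_dataset(
    "open-llm-leaderboard/details_jondurbin__airoboros-gpt-3.5-turbo-100k-7b",
    "harness_arc_challenge_25",
    split="2023_08_03T10_51_43.635262",
)

# Load the aggregated per-task metrics for the most recent run
# (assumed "results" config with a "latest" split, as in the other cards).
results = load_dataset(
    "open-llm-leaderboard/details_jondurbin__airoboros-gpt-3.5-turbo-100k-7b",
    "results",
    split="latest",
)
```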
## Latest results
These are the [latest results from run 2023-10-18T13:27:38.857487](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-gpt-3.5-turbo-100k-7b/blob/main/results_2023-10-18T13-27-38.857487.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.04236577181208054,
"em_stderr": 0.0020627521235254487,
"f1": 0.12625943791946267,
"f1_stderr": 0.002580178700995439,
"acc": 0.3718023208847917,
"acc_stderr": 0.008942653172749105
},
"harness|drop|3": {
"em": 0.04236577181208054,
"em_stderr": 0.0020627521235254487,
"f1": 0.12625943791946267,
"f1_stderr": 0.002580178700995439
},
"harness|gsm8k|5": {
"acc": 0.0356330553449583,
"acc_stderr": 0.005106107853744191
},
"harness|winogrande|5": {
"acc": 0.7079715864246251,
"acc_stderr": 0.012779198491754018
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_jondurbin__airoboros-gpt-3.5-turbo-100k-7b
|
[
"region:us"
] |
2023-08-18T10:20:56+00:00
|
{"pretty_name": "Evaluation run of jondurbin/airoboros-gpt-3.5-turbo-100k-7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [jondurbin/airoboros-gpt-3.5-turbo-100k-7b](https://huggingface.co/jondurbin/airoboros-gpt-3.5-turbo-100k-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jondurbin__airoboros-gpt-3.5-turbo-100k-7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-18T13:27:38.857487](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-gpt-3.5-turbo-100k-7b/blob/main/results_2023-10-18T13-27-38.857487.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.04236577181208054,\n \"em_stderr\": 0.0020627521235254487,\n \"f1\": 0.12625943791946267,\n \"f1_stderr\": 0.002580178700995439,\n \"acc\": 0.3718023208847917,\n \"acc_stderr\": 0.008942653172749105\n },\n \"harness|drop|3\": {\n \"em\": 0.04236577181208054,\n \"em_stderr\": 0.0020627521235254487,\n \"f1\": 0.12625943791946267,\n \"f1_stderr\": 0.002580178700995439\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0356330553449583,\n \"acc_stderr\": 0.005106107853744191\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7079715864246251,\n \"acc_stderr\": 0.012779198491754018\n }\n}\n```", "repo_url": "https://huggingface.co/jondurbin/airoboros-gpt-3.5-turbo-100k-7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_03T10_51_43.635262", "path": ["**/details_harness|arc:challenge|25_2023-08-03T10:51:43.635262.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-03T10:51:43.635262.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_18T13_27_38.857487", "path": ["**/details_harness|drop|3_2023-10-18T13-27-38.857487.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-18T13-27-38.857487.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_18T13_27_38.857487", "path": ["**/details_harness|gsm8k|5_2023-10-18T13-27-38.857487.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-18T13-27-38.857487.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_03T10_51_43.635262", "path": ["**/details_harness|hellaswag|10_2023-08-03T10:51:43.635262.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hellaswag|10_2023-08-03T10:51:43.635262.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_03T10_51_43.635262", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-03T10:51:43.635262.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-03T10:51:43.635262.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-03T10:51:43.635262.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-03T10:51:43.635262.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-03T10:51:43.635262.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-08-03T10:51:43.635262.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_03T10_51_43.635262", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-03T10:51:43.635262.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-03T10:51:43.635262.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_03T10_51_43.635262", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-03T10:51:43.635262.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-03T10:51:43.635262.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_03T10_51_43.635262", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-03T10:51:43.635262.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-03T10:51:43.635262.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_03T10_51_43.635262", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-03T10:51:43.635262.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-03T10:51:43.635262.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_03T10_51_43.635262", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-03T10:51:43.635262.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-03T10:51:43.635262.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_03T10_51_43.635262", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-03T10:51:43.635262.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-03T10:51:43.635262.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_03T10_51_43.635262", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-03T10:51:43.635262.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-03T10:51:43.635262.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_03T10_51_43.635262", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-03T10:51:43.635262.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-03T10:51:43.635262.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_03T10_51_43.635262", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-03T10:51:43.635262.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-03T10:51:43.635262.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_03T10_51_43.635262", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-03T10:51:43.635262.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-03T10:51:43.635262.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_08_03T10_51_43.635262", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-03T10:51:43.635262.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-03T10:51:43.635262.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_03T10_51_43.635262", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-03T10:51:43.635262.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-03T10:51:43.635262.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_03T10_51_43.635262", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-03T10:51:43.635262.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-03T10:51:43.635262.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_03T10_51_43.635262", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-03T10:51:43.635262.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-03T10:51:43.635262.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_03T10_51_43.635262", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-03T10:51:43.635262.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-03T10:51:43.635262.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_03T10_51_43.635262", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-03T10:51:43.635262.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-03T10:51:43.635262.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_03T10_51_43.635262", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-03T10:51:43.635262.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-03T10:51:43.635262.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_03T10_51_43.635262", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-03T10:51:43.635262.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-03T10:51:43.635262.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_03T10_51_43.635262", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-03T10:51:43.635262.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-03T10:51:43.635262.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_03T10_51_43.635262", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-03T10:51:43.635262.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-03T10:51:43.635262.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_03T10_51_43.635262", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-03T10:51:43.635262.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-03T10:51:43.635262.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_03T10_51_43.635262", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-03T10:51:43.635262.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-03T10:51:43.635262.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_03T10_51_43.635262", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-03T10:51:43.635262.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-03T10:51:43.635262.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_03T10_51_43.635262", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-03T10:51:43.635262.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-03T10:51:43.635262.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_03T10_51_43.635262", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-03T10:51:43.635262.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-03T10:51:43.635262.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_03T10_51_43.635262", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-03T10:51:43.635262.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-03T10:51:43.635262.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_03T10_51_43.635262", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-03T10:51:43.635262.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-03T10:51:43.635262.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_03T10_51_43.635262", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-03T10:51:43.635262.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-03T10:51:43.635262.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_03T10_51_43.635262", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-03T10:51:43.635262.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-03T10:51:43.635262.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_03T10_51_43.635262", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-03T10:51:43.635262.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-03T10:51:43.635262.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_03T10_51_43.635262", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-03T10:51:43.635262.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-03T10:51:43.635262.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_03T10_51_43.635262", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-03T10:51:43.635262.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-03T10:51:43.635262.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_03T10_51_43.635262", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-03T10:51:43.635262.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-03T10:51:43.635262.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_03T10_51_43.635262", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-03T10:51:43.635262.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-03T10:51:43.635262.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_03T10_51_43.635262", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-03T10:51:43.635262.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-03T10:51:43.635262.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_03T10_51_43.635262", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-03T10:51:43.635262.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-03T10:51:43.635262.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_03T10_51_43.635262", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-03T10:51:43.635262.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-03T10:51:43.635262.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_03T10_51_43.635262", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-03T10:51:43.635262.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-03T10:51:43.635262.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_03T10_51_43.635262", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-03T10:51:43.635262.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-03T10:51:43.635262.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_03T10_51_43.635262", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-03T10:51:43.635262.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-03T10:51:43.635262.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_03T10_51_43.635262", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-03T10:51:43.635262.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-03T10:51:43.635262.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_03T10_51_43.635262", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-03T10:51:43.635262.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-03T10:51:43.635262.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_03T10_51_43.635262", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-03T10:51:43.635262.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-03T10:51:43.635262.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_03T10_51_43.635262", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-03T10:51:43.635262.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-03T10:51:43.635262.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_03T10_51_43.635262", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-03T10:51:43.635262.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-03T10:51:43.635262.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_03T10_51_43.635262", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-03T10:51:43.635262.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-03T10:51:43.635262.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_03T10_51_43.635262", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-03T10:51:43.635262.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-03T10:51:43.635262.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_03T10_51_43.635262", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-03T10:51:43.635262.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-03T10:51:43.635262.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_03T10_51_43.635262", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-03T10:51:43.635262.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-03T10:51:43.635262.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_03T10_51_43.635262", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-03T10:51:43.635262.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-03T10:51:43.635262.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_03T10_51_43.635262", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-03T10:51:43.635262.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-03T10:51:43.635262.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_03T10_51_43.635262", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-03T10:51:43.635262.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-03T10:51:43.635262.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_03T10_51_43.635262", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-03T10:51:43.635262.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-03T10:51:43.635262.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_03T10_51_43.635262", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-03T10:51:43.635262.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-03T10:51:43.635262.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_03T10_51_43.635262", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-03T10:51:43.635262.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-03T10:51:43.635262.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_03T10_51_43.635262", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-03T10:51:43.635262.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-03T10:51:43.635262.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_03T10_51_43.635262", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-03T10:51:43.635262.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-03T10:51:43.635262.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_03T10_51_43.635262", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-03T10:51:43.635262.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-03T10:51:43.635262.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_18T13_27_38.857487", "path": ["**/details_harness|winogrande|5_2023-10-18T13-27-38.857487.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-18T13-27-38.857487.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_03T10_51_43.635262", "path": ["results_2023-08-03T10:51:43.635262.parquet"]}, {"split": "2023_10_18T13_27_38.857487", "path": ["results_2023-10-18T13-27-38.857487.parquet"]}, {"split": "latest", "path": ["results_2023-10-18T13-27-38.857487.parquet"]}]}]}
|
2023-10-18T12:27:51+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of jondurbin/airoboros-gpt-3.5-turbo-100k-7b
## Dataset Description
- Homepage: 
- Repository: https://huggingface.co/jondurbin/airoboros-gpt-3.5-turbo-100k-7b
- Paper: 
- Leaderboard: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- Point of Contact: [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model jondurbin/airoboros-gpt-3.5-turbo-100k-7b on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
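```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-gpt-3.5-turbo-100k-7b",
	"harness_winogrande_5",
	split="train")
```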
## Latest results
These are the latest results from run 2023-10-18T13:27:38.857487 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
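```python
{
    "all": {
        "em": 0.04236577181208054,
        "em_stderr": 0.0020627521235254487,
        "f1": 0.12625943791946267,
        "f1_stderr": 0.002580178700995439,
        "acc": 0.3718023208847917,
        "acc_stderr": 0.008942653172749105
    },
    "harness|drop|3": {
        "em": 0.04236577181208054,
        "em_stderr": 0.0020627521235254487,
        "f1": 0.12625943791946267,
        "f1_stderr": 0.002580178700995439
    },
    "harness|gsm8k|5": {
        "acc": 0.0356330553449583,
        "acc_stderr": 0.005106107853744191
    },
    "harness|winogrande|5": {
        "acc": 0.7079715864246251,
        "acc_stderr": 0.012779198491754018
    }
}
```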
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
cfeeee280d4bd205cff566113a2bf364d8778e7d
|
# Dataset Card for Evaluation run of jondurbin/airoboros-13b-gpt4-1.4
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/jondurbin/airoboros-13b-gpt4-1.4
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [jondurbin/airoboros-13b-gpt4-1.4](https://huggingface.co/jondurbin/airoboros-13b-gpt4-1.4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-13b-gpt4-1.4",
"harness_winogrande_5",
split="train")
```
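Beyond the "train" split, each timestamped run listed in the configuration metadata is exposed as its own split, and the aggregated "results" configuration can be loaded the same way. A minimal sketch, assuming the split names shown in this dataset's configuration metadata:

```python
from datasets import load_dataset

# Aggregated metrics; the "results" configuration keeps one split per run
# plus a "latest" alias, following the layout in the metadata above.
results = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-13b-gpt4-1.4",
                       "results",
                       split="latest")

# A specific run, addressed through its timestamped split (here for DROP):
drop_run = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-13b-gpt4-1.4",
                        "harness_drop_3",
                        split="2023_10_23T02_48_34.723506")
```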
## Latest results
These are the [latest results from run 2023-10-23T02:48:34.723506](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-13b-gpt4-1.4/blob/main/results_2023-10-23T02-48-34.723506.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.05285234899328859,
"em_stderr": 0.0022912930700355423,
"f1": 0.11820364932885902,
"f1_stderr": 0.0026017641356238645,
"acc": 0.41988112541310807,
"acc_stderr": 0.009659506214512746
},
"harness|drop|3": {
"em": 0.05285234899328859,
"em_stderr": 0.0022912930700355423,
"f1": 0.11820364932885902,
"f1_stderr": 0.0026017641356238645
},
"harness|gsm8k|5": {
"acc": 0.07733131159969674,
"acc_stderr": 0.007357713523222348
},
"harness|winogrande|5": {
"acc": 0.7624309392265194,
"acc_stderr": 0.011961298905803143
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_jondurbin__airoboros-13b-gpt4-1.4
|
[
"region:us"
] |
2023-08-18T10:21:04+00:00
|
{"pretty_name": "Evaluation run of jondurbin/airoboros-13b-gpt4-1.4", "dataset_summary": "Dataset automatically created during the evaluation run of model [jondurbin/airoboros-13b-gpt4-1.4](https://huggingface.co/jondurbin/airoboros-13b-gpt4-1.4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jondurbin__airoboros-13b-gpt4-1.4\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-23T02:48:34.723506](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-13b-gpt4-1.4/blob/main/results_2023-10-23T02-48-34.723506.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.05285234899328859,\n \"em_stderr\": 0.0022912930700355423,\n \"f1\": 0.11820364932885902,\n \"f1_stderr\": 0.0026017641356238645,\n \"acc\": 0.41988112541310807,\n \"acc_stderr\": 0.009659506214512746\n },\n \"harness|drop|3\": {\n \"em\": 0.05285234899328859,\n \"em_stderr\": 0.0022912930700355423,\n \"f1\": 0.11820364932885902,\n \"f1_stderr\": 0.0026017641356238645\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.07733131159969674,\n \"acc_stderr\": 0.007357713523222348\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7624309392265194,\n \"acc_stderr\": 0.011961298905803143\n }\n}\n```", "repo_url": "https://huggingface.co/jondurbin/airoboros-13b-gpt4-1.4", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T18_26_58.077469", "path": ["**/details_harness|arc:challenge|25_2023-07-19T18:26:58.077469.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T18:26:58.077469.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_22T16_14_52.979927", "path": ["**/details_harness|drop|3_2023-10-22T16-14-52.979927.parquet"]}, {"split": "2023_10_23T02_48_34.723506", "path": ["**/details_harness|drop|3_2023-10-23T02-48-34.723506.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-23T02-48-34.723506.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_22T16_14_52.979927", "path": ["**/details_harness|gsm8k|5_2023-10-22T16-14-52.979927.parquet"]}, {"split": "2023_10_23T02_48_34.723506", "path": ["**/details_harness|gsm8k|5_2023-10-23T02-48-34.723506.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-23T02-48-34.723506.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": 
"2023_07_19T18_26_58.077469", "path": ["**/details_harness|hellaswag|10_2023-07-19T18:26:58.077469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T18:26:58.077469.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T18_26_58.077469", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:26:58.077469.parquet", 
"**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:26:58.077469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:26:58.077469.parquet", 
"**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:26:58.077469.parquet", 
"**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T18:26:58.077469.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:26:58.077469.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_19T18_26_58.077469", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:26:58.077469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:26:58.077469.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T18_26_58.077469", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:26:58.077469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:26:58.077469.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T18_26_58.077469", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:26:58.077469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:26:58.077469.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T18_26_58.077469", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:26:58.077469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:26:58.077469.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T18_26_58.077469", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:26:58.077469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:26:58.077469.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T18_26_58.077469", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:26:58.077469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:26:58.077469.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T18_26_58.077469", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:26:58.077469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:26:58.077469.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T18_26_58.077469", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:26:58.077469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:26:58.077469.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T18_26_58.077469", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:26:58.077469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:26:58.077469.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T18_26_58.077469", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:26:58.077469.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:26:58.077469.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T18_26_58.077469", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:26:58.077469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:26:58.077469.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T18_26_58.077469", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:26:58.077469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:26:58.077469.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T18_26_58.077469", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:26:58.077469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:26:58.077469.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T18_26_58.077469", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:26:58.077469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:26:58.077469.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T18_26_58.077469", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:26:58.077469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:26:58.077469.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T18_26_58.077469", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:26:58.077469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:26:58.077469.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T18_26_58.077469", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:26:58.077469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:26:58.077469.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T18_26_58.077469", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:26:58.077469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:26:58.077469.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T18_26_58.077469", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:26:58.077469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:26:58.077469.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T18_26_58.077469", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:26:58.077469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:26:58.077469.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T18_26_58.077469", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:26:58.077469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:26:58.077469.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T18_26_58.077469", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:26:58.077469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:26:58.077469.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T18_26_58.077469", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:26:58.077469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:26:58.077469.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T18_26_58.077469", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:26:58.077469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:26:58.077469.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T18_26_58.077469", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:26:58.077469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:26:58.077469.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T18_26_58.077469", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:26:58.077469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:26:58.077469.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T18_26_58.077469", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:26:58.077469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:26:58.077469.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T18_26_58.077469", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:26:58.077469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:26:58.077469.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T18_26_58.077469", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:26:58.077469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:26:58.077469.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T18_26_58.077469", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:26:58.077469.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:26:58.077469.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T18_26_58.077469", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:26:58.077469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:26:58.077469.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T18_26_58.077469", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:26:58.077469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:26:58.077469.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T18_26_58.077469", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:26:58.077469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:26:58.077469.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T18_26_58.077469", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:26:58.077469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:26:58.077469.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T18_26_58.077469", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:26:58.077469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:26:58.077469.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T18_26_58.077469", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:26:58.077469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:26:58.077469.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T18_26_58.077469", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:26:58.077469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:26:58.077469.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T18_26_58.077469", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:26:58.077469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:26:58.077469.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T18_26_58.077469", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T18:26:58.077469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T18:26:58.077469.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T18_26_58.077469", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:26:58.077469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:26:58.077469.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": 
[{"split": "2023_07_19T18_26_58.077469", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:26:58.077469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:26:58.077469.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T18_26_58.077469", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:26:58.077469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:26:58.077469.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T18_26_58.077469", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:26:58.077469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:26:58.077469.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T18_26_58.077469", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:26:58.077469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:26:58.077469.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T18_26_58.077469", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:26:58.077469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:26:58.077469.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T18_26_58.077469", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:26:58.077469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:26:58.077469.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T18_26_58.077469", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:26:58.077469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:26:58.077469.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T18_26_58.077469", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:26:58.077469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:26:58.077469.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T18_26_58.077469", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:26:58.077469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:26:58.077469.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T18_26_58.077469", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:26:58.077469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:26:58.077469.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T18_26_58.077469", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:26:58.077469.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:26:58.077469.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_19T18_26_58.077469", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:26:58.077469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:26:58.077469.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T18_26_58.077469", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:26:58.077469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:26:58.077469.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T18_26_58.077469", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:26:58.077469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:26:58.077469.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T18_26_58.077469", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:26:58.077469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:26:58.077469.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T18_26_58.077469", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T18:26:58.077469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T18:26:58.077469.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T18_26_58.077469", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:26:58.077469.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:26:58.077469.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T18_26_58.077469", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T18:26:58.077469.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T18:26:58.077469.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_22T16_14_52.979927", "path": ["**/details_harness|winogrande|5_2023-10-22T16-14-52.979927.parquet"]}, {"split": "2023_10_23T02_48_34.723506", "path": ["**/details_harness|winogrande|5_2023-10-23T02-48-34.723506.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-23T02-48-34.723506.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T18_26_58.077469", "path": ["results_2023-07-19T18:26:58.077469.parquet"]}, {"split": "2023_10_22T16_14_52.979927", "path": ["results_2023-10-22T16-14-52.979927.parquet"]}, {"split": "2023_10_23T02_48_34.723506", "path": ["results_2023-10-23T02-48-34.723506.parquet"]}, {"split": "latest", "path": ["results_2023-10-23T02-48-34.723506.parquet"]}]}]}
|
2023-10-23T01:48:43+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of jondurbin/airoboros-13b-gpt4-1.4
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model jondurbin/airoboros-13b-gpt4-1.4 on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
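The snippet that normally accompanies this sentence was stripped from this copy of the card; below is a minimal sketch mirroring the loading pattern used by the other evaluation-run cards in this collection. The repository id is an assumption derived from the standard `details_<org>__<model>` naming convention.

```python
from datasets import load_dataset

# Per-sample details for one task (5-shot Winogrande); the "train"
# split always points to the latest run.
# Repo id assumed from the details_<org>__<model> convention.
data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-13b-gpt4-1.4",
	"harness_winogrande_5",
	split="train")
```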
## Latest results
These are the latest results from run 2023-10-23T02:48:34.723506 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
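The aggregated numbers themselves are not reproduced in this copy of the card; a minimal sketch for retrieving them, assuming the same repository id as above and the "results" configuration with its "latest" split (both listed in this card's file metadata):

```python
from datasets import load_dataset

# Aggregated metrics live in the "results" config; "latest" points at
# the most recent run (2023-10-23T02:48:34.723506 per the file listing).
results = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-13b-gpt4-1.4",
	"results",
	split="latest")
```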
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of jondurbin/airoboros-13b-gpt4-1.4",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model jondurbin/airoboros-13b-gpt4-1.4 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-23T02:48:34.723506(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of jondurbin/airoboros-13b-gpt4-1.4",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model jondurbin/airoboros-13b-gpt4-1.4 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-23T02:48:34.723506(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
23,
31,
171,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of jondurbin/airoboros-13b-gpt4-1.4## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model jondurbin/airoboros-13b-gpt4-1.4 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-23T02:48:34.723506(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
fe166f7f2e6bf4dbeb3c9999d6375cb0c9791f18
|
# Dataset Card for Evaluation run of jondurbin/airoboros-7b-gpt4-1.4.1-qlora
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/jondurbin/airoboros-7b-gpt4-1.4.1-qlora
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [jondurbin/airoboros-7b-gpt4-1.4.1-qlora](https://huggingface.co/jondurbin/airoboros-7b-gpt4-1.4.1-qlora) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-7b-gpt4-1.4.1-qlora",
"harness_winogrande_5",
split="train")
```
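Once loaded, `data` is a regular `datasets.Dataset`; a quick way to inspect what an evaluation row contains (a sketch; the exact column names depend on the harness version that produced the run):

```python
# List the available columns before relying on any specific field,
# then look at the first evaluated sample.
print(data.column_names)
print(data[0])
```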
## Latest results
These are the [latest results from run 2023-10-22T17:32:07.033502](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-7b-gpt4-1.4.1-qlora/blob/main/results_2023-10-22T17-32-07.033502.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.014261744966442953,
"em_stderr": 0.0012142463031160381,
"f1": 0.06955117449664444,
"f1_stderr": 0.0017051232683047734,
"acc": 0.3629902469702419,
"acc_stderr": 0.00847293016647585
},
"harness|drop|3": {
"em": 0.014261744966442953,
"em_stderr": 0.0012142463031160381,
"f1": 0.06955117449664444,
"f1_stderr": 0.0017051232683047734
},
"harness|gsm8k|5": {
"acc": 0.022744503411675512,
"acc_stderr": 0.004106620637749678
},
"harness|winogrande|5": {
"acc": 0.7032359905288083,
"acc_stderr": 0.012839239695202022
}
}
```
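The same aggregated metrics can be fetched programmatically through the "results" configuration declared in this card's metadata; a minimal sketch, assuming the metrics are stored as rows of the latest results parquet file:

```python
from datasets import load_dataset

# "latest" resolves to results_2023-10-22T17-32-07.033502.parquet
# per the config listing for this dataset.
results = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-7b-gpt4-1.4.1-qlora",
	"results",
	split="latest")
print(results[0])  # first (and typically only) row of aggregated metrics
```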
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_jondurbin__airoboros-7b-gpt4-1.4.1-qlora
|
[
"region:us"
] |
2023-08-18T10:21:12+00:00
|
{"pretty_name": "Evaluation run of jondurbin/airoboros-7b-gpt4-1.4.1-qlora", "dataset_summary": "Dataset automatically created during the evaluation run of model [jondurbin/airoboros-7b-gpt4-1.4.1-qlora](https://huggingface.co/jondurbin/airoboros-7b-gpt4-1.4.1-qlora) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jondurbin__airoboros-7b-gpt4-1.4.1-qlora\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-22T17:32:07.033502](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-7b-gpt4-1.4.1-qlora/blob/main/results_2023-10-22T17-32-07.033502.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.014261744966442953,\n \"em_stderr\": 0.0012142463031160381,\n \"f1\": 0.06955117449664444,\n \"f1_stderr\": 0.0017051232683047734,\n \"acc\": 0.3629902469702419,\n \"acc_stderr\": 0.00847293016647585\n },\n \"harness|drop|3\": {\n \"em\": 0.014261744966442953,\n \"em_stderr\": 0.0012142463031160381,\n \"f1\": 0.06955117449664444,\n \"f1_stderr\": 0.0017051232683047734\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.022744503411675512,\n \"acc_stderr\": 0.004106620637749678\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7032359905288083,\n \"acc_stderr\": 0.012839239695202022\n }\n}\n```", "repo_url": "https://huggingface.co/jondurbin/airoboros-7b-gpt4-1.4.1-qlora", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_03T10_49_30.523050", "path": ["**/details_harness|arc:challenge|25_2023-08-03T10:49:30.523050.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-03T10:49:30.523050.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_22T17_32_07.033502", "path": ["**/details_harness|drop|3_2023-10-22T17-32-07.033502.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-22T17-32-07.033502.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_22T17_32_07.033502", "path": ["**/details_harness|gsm8k|5_2023-10-22T17-32-07.033502.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-22T17-32-07.033502.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_03T10_49_30.523050", "path": ["**/details_harness|hellaswag|10_2023-08-03T10:49:30.523050.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hellaswag|10_2023-08-03T10:49:30.523050.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_03T10_49_30.523050", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-03T10:49:30.523050.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-03T10:49:30.523050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-03T10:49:30.523050.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-03T10:49:30.523050.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-03T10:49:30.523050.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-08-03T10:49:30.523050.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_03T10_49_30.523050", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-03T10:49:30.523050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-03T10:49:30.523050.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_03T10_49_30.523050", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-03T10:49:30.523050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-03T10:49:30.523050.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_03T10_49_30.523050", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-03T10:49:30.523050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-03T10:49:30.523050.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_03T10_49_30.523050", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-03T10:49:30.523050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-03T10:49:30.523050.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_03T10_49_30.523050", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-03T10:49:30.523050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-03T10:49:30.523050.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_03T10_49_30.523050", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-03T10:49:30.523050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-03T10:49:30.523050.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_03T10_49_30.523050", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-03T10:49:30.523050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-03T10:49:30.523050.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_03T10_49_30.523050", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-03T10:49:30.523050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-03T10:49:30.523050.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_03T10_49_30.523050", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-03T10:49:30.523050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-03T10:49:30.523050.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_03T10_49_30.523050", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-03T10:49:30.523050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-03T10:49:30.523050.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_08_03T10_49_30.523050", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-03T10:49:30.523050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-03T10:49:30.523050.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_03T10_49_30.523050", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-03T10:49:30.523050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-03T10:49:30.523050.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_03T10_49_30.523050", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-03T10:49:30.523050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-03T10:49:30.523050.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_03T10_49_30.523050", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-03T10:49:30.523050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-03T10:49:30.523050.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_03T10_49_30.523050", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-03T10:49:30.523050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-03T10:49:30.523050.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_03T10_49_30.523050", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-03T10:49:30.523050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-03T10:49:30.523050.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_03T10_49_30.523050", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-03T10:49:30.523050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-03T10:49:30.523050.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_03T10_49_30.523050", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-03T10:49:30.523050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-03T10:49:30.523050.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_03T10_49_30.523050", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-03T10:49:30.523050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-03T10:49:30.523050.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_03T10_49_30.523050", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-03T10:49:30.523050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-03T10:49:30.523050.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_03T10_49_30.523050", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-03T10:49:30.523050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-03T10:49:30.523050.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_03T10_49_30.523050", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-03T10:49:30.523050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-03T10:49:30.523050.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_03T10_49_30.523050", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-03T10:49:30.523050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-03T10:49:30.523050.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_03T10_49_30.523050", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-03T10:49:30.523050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-03T10:49:30.523050.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_03T10_49_30.523050", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-03T10:49:30.523050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-03T10:49:30.523050.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_03T10_49_30.523050", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-03T10:49:30.523050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-03T10:49:30.523050.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_03T10_49_30.523050", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-03T10:49:30.523050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-03T10:49:30.523050.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_03T10_49_30.523050", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-03T10:49:30.523050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-03T10:49:30.523050.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_03T10_49_30.523050", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-03T10:49:30.523050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-03T10:49:30.523050.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_03T10_49_30.523050", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-03T10:49:30.523050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-03T10:49:30.523050.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_03T10_49_30.523050", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-03T10:49:30.523050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-03T10:49:30.523050.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_03T10_49_30.523050", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-03T10:49:30.523050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-03T10:49:30.523050.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_03T10_49_30.523050", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-03T10:49:30.523050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-03T10:49:30.523050.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_03T10_49_30.523050", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-03T10:49:30.523050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-03T10:49:30.523050.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_03T10_49_30.523050", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-03T10:49:30.523050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-03T10:49:30.523050.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_03T10_49_30.523050", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-03T10:49:30.523050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-03T10:49:30.523050.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_03T10_49_30.523050", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-03T10:49:30.523050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-03T10:49:30.523050.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_03T10_49_30.523050", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-03T10:49:30.523050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-03T10:49:30.523050.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_03T10_49_30.523050", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-03T10:49:30.523050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-03T10:49:30.523050.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_03T10_49_30.523050", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-03T10:49:30.523050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-03T10:49:30.523050.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_03T10_49_30.523050", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-03T10:49:30.523050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-03T10:49:30.523050.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_03T10_49_30.523050", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-03T10:49:30.523050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-03T10:49:30.523050.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_03T10_49_30.523050", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-03T10:49:30.523050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-03T10:49:30.523050.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_03T10_49_30.523050", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-03T10:49:30.523050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-03T10:49:30.523050.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_03T10_49_30.523050", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-03T10:49:30.523050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-03T10:49:30.523050.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_03T10_49_30.523050", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-03T10:49:30.523050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-03T10:49:30.523050.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_03T10_49_30.523050", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-03T10:49:30.523050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-03T10:49:30.523050.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_03T10_49_30.523050", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-03T10:49:30.523050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-03T10:49:30.523050.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_03T10_49_30.523050", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-03T10:49:30.523050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-03T10:49:30.523050.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_03T10_49_30.523050", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-03T10:49:30.523050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-03T10:49:30.523050.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_03T10_49_30.523050", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-03T10:49:30.523050.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-03T10:49:30.523050.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_03T10_49_30.523050", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-03T10:49:30.523050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-03T10:49:30.523050.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_03T10_49_30.523050", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-03T10:49:30.523050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-03T10:49:30.523050.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_03T10_49_30.523050", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-03T10:49:30.523050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-03T10:49:30.523050.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_03T10_49_30.523050", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-03T10:49:30.523050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-03T10:49:30.523050.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_03T10_49_30.523050", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-03T10:49:30.523050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-03T10:49:30.523050.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_03T10_49_30.523050", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-03T10:49:30.523050.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-03T10:49:30.523050.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_03T10_49_30.523050", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-03T10:49:30.523050.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-03T10:49:30.523050.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_22T17_32_07.033502", "path": ["**/details_harness|winogrande|5_2023-10-22T17-32-07.033502.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-22T17-32-07.033502.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_03T10_49_30.523050", "path": ["results_2023-08-03T10:49:30.523050.parquet"]}, {"split": "2023_10_22T17_32_07.033502", "path": ["results_2023-10-22T17-32-07.033502.parquet"]}, {"split": "latest", "path": ["results_2023-10-22T17-32-07.033502.parquet"]}]}]}
|
2023-10-22T16:32:19+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of jondurbin/airoboros-7b-gpt4-1.4.1-qlora
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model jondurbin/airoboros-7b-gpt4-1.4.1-qlora on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
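A minimal loading sketch (the snippet was stripped from this copy of the card), assuming the dataset lives at the `open-llm-leaderboard/details_jondurbin__airoboros-7b-gpt4-1.4.1-qlora` repo id implied by this collection's `details_<org>__<model>` naming convention:

```python
from datasets import load_dataset

# Assumed repo id, following the collection's details_<org>__<model> convention
data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-7b-gpt4-1.4.1-qlora",
	"harness_winogrande_5",
	split="train")
```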
## Latest results
These are the latest results from run 2023-10-22T17:32:07.033502 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the "results" config and the "latest" split for each eval):
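Each run is also exposed as its own timestamped split (the split names appear in the dataset metadata), so a run can be pinned instead of tracking the moving "latest" split; a short sketch reusing the assumed repo id above:

```python
from datasets import load_dataset

repo = "open-llm-leaderboard/details_jondurbin__airoboros-7b-gpt4-1.4.1-qlora"  # assumed repo id
# Pin the 2023-10-22 run by its timestamped split instead of following "latest"
pinned = load_dataset(repo, "harness_winogrande_5", split="2023_10_22T17_32_07.033502")
latest = load_dataset(repo, "harness_winogrande_5", split="latest")
print(len(pinned), len(latest))  # identical here, until a newer eval lands
```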
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of jondurbin/airoboros-7b-gpt4-1.4.1-qlora",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model jondurbin/airoboros-7b-gpt4-1.4.1-qlora on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-22T17:32:07.033502(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of jondurbin/airoboros-7b-gpt4-1.4.1-qlora",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model jondurbin/airoboros-7b-gpt4-1.4.1-qlora on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-22T17:32:07.033502(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
27,
31,
175,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of jondurbin/airoboros-7b-gpt4-1.4.1-qlora## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model jondurbin/airoboros-7b-gpt4-1.4.1-qlora on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-22T17:32:07.033502(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
b6e585736b54b1ec97cab0ea20de2689757290e6
|
# Dataset Card for Evaluation run of jondurbin/airoboros-65b-gpt4-1.4
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/jondurbin/airoboros-65b-gpt4-1.4
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [jondurbin/airoboros-65b-gpt4-1.4](https://huggingface.co/jondurbin/airoboros-65b-gpt4-1.4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-65b-gpt4-1.4",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-29T17:40:12.862636](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-65b-gpt4-1.4/blob/main/results_2023-10-29T17-40-12.862636.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the "results" config and the "latest" split for each eval):
```python
{
"all": {
"em": 0.048133389261744965,
"em_stderr": 0.0021920523387187097,
"f1": 0.11759647651006695,
"f1_stderr": 0.0024834303220337473,
"acc": 0.4884045517729164,
"acc_stderr": 0.010955153685387409
},
"harness|drop|3": {
"em": 0.048133389261744965,
"em_stderr": 0.0021920523387187097,
"f1": 0.11759647651006695,
"f1_stderr": 0.0024834303220337473
},
"harness|gsm8k|5": {
"acc": 0.18043972706595907,
"acc_stderr": 0.010592508589147896
},
"harness|winogrande|5": {
"acc": 0.7963693764798737,
"acc_stderr": 0.011317798781626922
}
}
```
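The same aggregated numbers can be pulled programmatically from the "results" config, whose "latest" split tracks the most recent run; a minimal sketch (the exact row schema is not documented here, so treat the printout as exploratory):

```python
from datasets import load_dataset

# The "results" config aggregates every run; its "latest" split mirrors the JSON above
results = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-65b-gpt4-1.4",
	"results",
	split="latest")
print(results[0])  # aggregated metrics row for the latest run
```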
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_jondurbin__airoboros-65b-gpt4-1.4
|
[
"region:us"
] |
2023-08-18T10:21:22+00:00
|
{"pretty_name": "Evaluation run of jondurbin/airoboros-65b-gpt4-1.4", "dataset_summary": "Dataset automatically created during the evaluation run of model [jondurbin/airoboros-65b-gpt4-1.4](https://huggingface.co/jondurbin/airoboros-65b-gpt4-1.4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jondurbin__airoboros-65b-gpt4-1.4\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-29T17:40:12.862636](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-65b-gpt4-1.4/blob/main/results_2023-10-29T17-40-12.862636.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.048133389261744965,\n \"em_stderr\": 0.0021920523387187097,\n \"f1\": 0.11759647651006695,\n \"f1_stderr\": 0.0024834303220337473,\n \"acc\": 0.4884045517729164,\n \"acc_stderr\": 0.010955153685387409\n },\n \"harness|drop|3\": {\n \"em\": 0.048133389261744965,\n \"em_stderr\": 0.0021920523387187097,\n \"f1\": 0.11759647651006695,\n \"f1_stderr\": 0.0024834303220337473\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.18043972706595907,\n \"acc_stderr\": 0.010592508589147896\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7963693764798737,\n \"acc_stderr\": 0.011317798781626922\n }\n}\n```", "repo_url": "https://huggingface.co/jondurbin/airoboros-65b-gpt4-1.4", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_09T18_17_34.414751", "path": ["**/details_harness|arc:challenge|25_2023-08-09T18:17:34.414751.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-09T18:17:34.414751.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_23T05_17_46.947970", "path": ["**/details_harness|drop|3_2023-10-23T05-17-46.947970.parquet"]}, {"split": "2023_10_23T05_35_00.206888", "path": ["**/details_harness|drop|3_2023-10-23T05-35-00.206888.parquet"]}, {"split": "2023_10_29T17_40_12.862636", "path": ["**/details_harness|drop|3_2023-10-29T17-40-12.862636.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-29T17-40-12.862636.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_23T05_17_46.947970", "path": ["**/details_harness|gsm8k|5_2023-10-23T05-17-46.947970.parquet"]}, {"split": "2023_10_23T05_35_00.206888", "path": ["**/details_harness|gsm8k|5_2023-10-23T05-35-00.206888.parquet"]}, {"split": "2023_10_29T17_40_12.862636", "path": 
["**/details_harness|gsm8k|5_2023-10-29T17-40-12.862636.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-29T17-40-12.862636.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_09T18_17_34.414751", "path": ["**/details_harness|hellaswag|10_2023-08-09T18:17:34.414751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-09T18:17:34.414751.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_09T18_17_34.414751", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T18:17:34.414751.parquet", 
"**/details_harness|hendrycksTest-human_aging|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T18:17:34.414751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T18:17:34.414751.parquet", 
"**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T18:17:34.414751.parquet", 
"**/details_harness|hendrycksTest-security_studies|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T18:17:34.414751.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T18:17:34.414751.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_09T18_17_34.414751", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T18:17:34.414751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T18:17:34.414751.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_09T18_17_34.414751", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T18:17:34.414751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T18:17:34.414751.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_09T18_17_34.414751", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T18:17:34.414751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T18:17:34.414751.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_09T18_17_34.414751", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T18:17:34.414751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T18:17:34.414751.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_09T18_17_34.414751", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T18:17:34.414751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T18:17:34.414751.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_09T18_17_34.414751", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T18:17:34.414751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T18:17:34.414751.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_09T18_17_34.414751", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T18:17:34.414751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T18:17:34.414751.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_09T18_17_34.414751", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T18:17:34.414751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T18:17:34.414751.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_09T18_17_34.414751", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T18:17:34.414751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T18:17:34.414751.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": 
[{"split": "2023_08_09T18_17_34.414751", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T18:17:34.414751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T18:17:34.414751.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_09T18_17_34.414751", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T18:17:34.414751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T18:17:34.414751.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_09T18_17_34.414751", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T18:17:34.414751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T18:17:34.414751.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_09T18_17_34.414751", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T18:17:34.414751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T18:17:34.414751.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_09T18_17_34.414751", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T18:17:34.414751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T18:17:34.414751.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_09T18_17_34.414751", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T18:17:34.414751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T18:17:34.414751.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_09T18_17_34.414751", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T18:17:34.414751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T18:17:34.414751.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_09T18_17_34.414751", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T18:17:34.414751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T18:17:34.414751.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_09T18_17_34.414751", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T18:17:34.414751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T18:17:34.414751.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_09T18_17_34.414751", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T18:17:34.414751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T18:17:34.414751.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_09T18_17_34.414751", "path": 
["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T18:17:34.414751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T18:17:34.414751.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_09T18_17_34.414751", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T18:17:34.414751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T18:17:34.414751.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_09T18_17_34.414751", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T18:17:34.414751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T18:17:34.414751.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_09T18_17_34.414751", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T18:17:34.414751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T18:17:34.414751.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_09T18_17_34.414751", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T18:17:34.414751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T18:17:34.414751.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_09T18_17_34.414751", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T18:17:34.414751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T18:17:34.414751.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_09T18_17_34.414751", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T18:17:34.414751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T18:17:34.414751.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_09T18_17_34.414751", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T18:17:34.414751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T18:17:34.414751.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_09T18_17_34.414751", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T18:17:34.414751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T18:17:34.414751.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_09T18_17_34.414751", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T18:17:34.414751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T18:17:34.414751.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_09T18_17_34.414751", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T18:17:34.414751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T18:17:34.414751.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_09T18_17_34.414751", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T18:17:34.414751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T18:17:34.414751.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_09T18_17_34.414751", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T18:17:34.414751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T18:17:34.414751.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_09T18_17_34.414751", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T18:17:34.414751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T18:17:34.414751.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_09T18_17_34.414751", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T18:17:34.414751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T18:17:34.414751.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_09T18_17_34.414751", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T18:17:34.414751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T18:17:34.414751.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_09T18_17_34.414751", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T18:17:34.414751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T18:17:34.414751.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_09T18_17_34.414751", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T18:17:34.414751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T18:17:34.414751.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_09T18_17_34.414751", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T18:17:34.414751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T18:17:34.414751.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_09T18_17_34.414751", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T18:17:34.414751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T18:17:34.414751.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_09T18_17_34.414751", "path": 
["**/details_harness|hendrycksTest-marketing|5_2023-08-09T18:17:34.414751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T18:17:34.414751.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_09T18_17_34.414751", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T18:17:34.414751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T18:17:34.414751.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_09T18_17_34.414751", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T18:17:34.414751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T18:17:34.414751.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_09T18_17_34.414751", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T18:17:34.414751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T18:17:34.414751.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_09T18_17_34.414751", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T18:17:34.414751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T18:17:34.414751.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_09T18_17_34.414751", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T18:17:34.414751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T18:17:34.414751.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_09T18_17_34.414751", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T18:17:34.414751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T18:17:34.414751.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_09T18_17_34.414751", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T18:17:34.414751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T18:17:34.414751.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_09T18_17_34.414751", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T18:17:34.414751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T18:17:34.414751.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_09T18_17_34.414751", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T18:17:34.414751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T18:17:34.414751.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_09T18_17_34.414751", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T18:17:34.414751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T18:17:34.414751.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_09T18_17_34.414751", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T18:17:34.414751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T18:17:34.414751.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_09T18_17_34.414751", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T18:17:34.414751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T18:17:34.414751.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_09T18_17_34.414751", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T18:17:34.414751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T18:17:34.414751.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_09T18_17_34.414751", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T18:17:34.414751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T18:17:34.414751.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_09T18_17_34.414751", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T18:17:34.414751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T18:17:34.414751.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_09T18_17_34.414751", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T18:17:34.414751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T18:17:34.414751.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_09T18_17_34.414751", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T18:17:34.414751.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T18:17:34.414751.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_09T18_17_34.414751", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T18:17:34.414751.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T18:17:34.414751.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_23T05_17_46.947970", "path": ["**/details_harness|winogrande|5_2023-10-23T05-17-46.947970.parquet"]}, {"split": "2023_10_23T05_35_00.206888", "path": ["**/details_harness|winogrande|5_2023-10-23T05-35-00.206888.parquet"]}, {"split": "2023_10_29T17_40_12.862636", "path": ["**/details_harness|winogrande|5_2023-10-29T17-40-12.862636.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-29T17-40-12.862636.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_09T18_17_34.414751", "path": ["results_2023-08-09T18:17:34.414751.parquet"]}, {"split": "2023_10_23T05_17_46.947970", "path": ["results_2023-10-23T05-17-46.947970.parquet"]}, {"split": "2023_10_23T05_35_00.206888", "path": ["results_2023-10-23T05-35-00.206888.parquet"]}, {"split": "2023_10_29T17_40_12.862636", "path": 
["results_2023-10-29T17-40-12.862636.parquet"]}, {"split": "latest", "path": ["results_2023-10-29T17-40-12.862636.parquet"]}]}]}
|
2023-10-29T17:40:20+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of jondurbin/airoboros-65b-gpt4-1.4
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model jondurbin/airoboros-65b-gpt4-1.4 on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
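The loading snippet was stripped from this copy of the card; it follows the same pattern shown in the full card above:

```python
from datasets import load_dataset

data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-65b-gpt4-1.4",
	"harness_winogrande_5",
	split="train")
```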
## Latest results
These are the latest results from run 2023-10-29T17:40:12.862636 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the "results" config and the "latest" split for each eval):
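Behind these aggregated metrics, each per-task config (for example `harness_drop_3`, listed in the dataset metadata) holds example-level rows; a quick way to inspect them, assuming pandas is installed:

```python
from datasets import load_dataset

details = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-65b-gpt4-1.4",
	"harness_drop_3",
	split="latest")
df = details.to_pandas()  # example-level predictions behind the aggregated em/f1
print(df.head())
```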
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of jondurbin/airoboros-65b-gpt4-1.4",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model jondurbin/airoboros-65b-gpt4-1.4 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-29T17:40:12.862636(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of jondurbin/airoboros-65b-gpt4-1.4",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model jondurbin/airoboros-65b-gpt4-1.4 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-29T17:40:12.862636(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
24,
31,
172,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of jondurbin/airoboros-65b-gpt4-1.4## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model jondurbin/airoboros-65b-gpt4-1.4 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-29T17:40:12.862636(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
2445758173e414309a3431fb5833bc79a29f1ba8
|
# Dataset Card for Evaluation run of jondurbin/airoboros-65b-gpt4-1.3
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/jondurbin/airoboros-65b-gpt4-1.3
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [jondurbin/airoboros-65b-gpt4-1.3](https://huggingface.co/jondurbin/airoboros-65b-gpt4-1.3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-65b-gpt4-1.3",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-19T00:22:24.283273](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-65b-gpt4-1.3/blob/main/results_2023-10-19T00-22-24.283273.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the "results" config and the "latest" split for each eval):
```python
{
"all": {
"em": 0.40635486577181207,
"em_stderr": 0.00502985933530148,
"f1": 0.49071728187919794,
"f1_stderr": 0.0047528105237378505,
"acc": 0.4679967304402357,
"acc_stderr": 0.010353850140010314
},
"harness|drop|3": {
"em": 0.40635486577181207,
"em_stderr": 0.00502985933530148,
"f1": 0.49071728187919794,
"f1_stderr": 0.0047528105237378505
},
"harness|gsm8k|5": {
"acc": 0.13646702047005307,
"acc_stderr": 0.00945574199881554
},
"harness|winogrande|5": {
"acc": 0.7995264404104183,
"acc_stderr": 0.011251958281205085
}
}
```
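With dozens of configs per details dataset, it can help to enumerate them before loading; a small sketch using standard `datasets` helpers:

```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_jondurbin__airoboros-65b-gpt4-1.3"
configs = get_dataset_config_names(repo)  # per-task configs plus "results"
print(len(configs), configs[:5])

# Pull the aggregated metrics shown above
results = load_dataset(repo, "results", split="latest")
```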
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_jondurbin__airoboros-65b-gpt4-1.3
|
[
"region:us"
] |
2023-08-18T10:21:30+00:00
|
{"pretty_name": "Evaluation run of jondurbin/airoboros-65b-gpt4-1.3", "dataset_summary": "Dataset automatically created during the evaluation run of model [jondurbin/airoboros-65b-gpt4-1.3](https://huggingface.co/jondurbin/airoboros-65b-gpt4-1.3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jondurbin__airoboros-65b-gpt4-1.3\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-19T00:22:24.283273](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-65b-gpt4-1.3/blob/main/results_2023-10-19T00-22-24.283273.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.40635486577181207,\n \"em_stderr\": 0.00502985933530148,\n \"f1\": 0.49071728187919794,\n \"f1_stderr\": 0.0047528105237378505,\n \"acc\": 0.4679967304402357,\n \"acc_stderr\": 0.010353850140010314\n },\n \"harness|drop|3\": {\n \"em\": 0.40635486577181207,\n \"em_stderr\": 0.00502985933530148,\n \"f1\": 0.49071728187919794,\n \"f1_stderr\": 0.0047528105237378505\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.13646702047005307,\n \"acc_stderr\": 0.00945574199881554\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7995264404104183,\n \"acc_stderr\": 0.011251958281205085\n }\n}\n```", "repo_url": "https://huggingface.co/jondurbin/airoboros-65b-gpt4-1.3", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_09T14_21_18.857678", "path": ["**/details_harness|arc:challenge|25_2023-08-09T14:21:18.857678.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-09T14:21:18.857678.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_19T00_22_24.283273", "path": ["**/details_harness|drop|3_2023-10-19T00-22-24.283273.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-19T00-22-24.283273.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_19T00_22_24.283273", "path": ["**/details_harness|gsm8k|5_2023-10-19T00-22-24.283273.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-19T00-22-24.283273.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_09T14_21_18.857678", "path": ["**/details_harness|hellaswag|10_2023-08-09T14:21:18.857678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-09T14:21:18.857678.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_5", "data_files": [{"split": "2023_08_09T14_21_18.857678", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T14:21:18.857678.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T14:21:18.857678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T14:21:18.857678.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T14:21:18.857678.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T14:21:18.857678.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_08_09T14_21_18.857678", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T14:21:18.857678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T14:21:18.857678.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_09T14_21_18.857678", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T14:21:18.857678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T14:21:18.857678.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_09T14_21_18.857678", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T14:21:18.857678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T14:21:18.857678.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_09T14_21_18.857678", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T14:21:18.857678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T14:21:18.857678.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_09T14_21_18.857678", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T14:21:18.857678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T14:21:18.857678.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_09T14_21_18.857678", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T14:21:18.857678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T14:21:18.857678.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_09T14_21_18.857678", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T14:21:18.857678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T14:21:18.857678.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_09T14_21_18.857678", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T14:21:18.857678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T14:21:18.857678.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_09T14_21_18.857678", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T14:21:18.857678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T14:21:18.857678.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_09T14_21_18.857678", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T14:21:18.857678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T14:21:18.857678.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_09T14_21_18.857678", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T14:21:18.857678.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T14:21:18.857678.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_09T14_21_18.857678", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T14:21:18.857678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T14:21:18.857678.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_09T14_21_18.857678", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T14:21:18.857678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T14:21:18.857678.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_09T14_21_18.857678", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T14:21:18.857678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T14:21:18.857678.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_09T14_21_18.857678", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T14:21:18.857678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T14:21:18.857678.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_09T14_21_18.857678", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T14:21:18.857678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T14:21:18.857678.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_09T14_21_18.857678", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T14:21:18.857678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T14:21:18.857678.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_09T14_21_18.857678", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T14:21:18.857678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T14:21:18.857678.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_09T14_21_18.857678", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T14:21:18.857678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T14:21:18.857678.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_09T14_21_18.857678", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T14:21:18.857678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T14:21:18.857678.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_09T14_21_18.857678", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T14:21:18.857678.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T14:21:18.857678.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_09T14_21_18.857678", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T14:21:18.857678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T14:21:18.857678.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_09T14_21_18.857678", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T14:21:18.857678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T14:21:18.857678.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_09T14_21_18.857678", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T14:21:18.857678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T14:21:18.857678.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_09T14_21_18.857678", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T14:21:18.857678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T14:21:18.857678.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_09T14_21_18.857678", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T14:21:18.857678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T14:21:18.857678.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_09T14_21_18.857678", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T14:21:18.857678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T14:21:18.857678.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_09T14_21_18.857678", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T14:21:18.857678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T14:21:18.857678.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_09T14_21_18.857678", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T14:21:18.857678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T14:21:18.857678.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_09T14_21_18.857678", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T14:21:18.857678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T14:21:18.857678.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_09T14_21_18.857678", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T14:21:18.857678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T14:21:18.857678.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_09T14_21_18.857678", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T14:21:18.857678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T14:21:18.857678.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_09T14_21_18.857678", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T14:21:18.857678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T14:21:18.857678.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_09T14_21_18.857678", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T14:21:18.857678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T14:21:18.857678.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_09T14_21_18.857678", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T14:21:18.857678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T14:21:18.857678.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_09T14_21_18.857678", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T14:21:18.857678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T14:21:18.857678.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_09T14_21_18.857678", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T14:21:18.857678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T14:21:18.857678.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_09T14_21_18.857678", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T14:21:18.857678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T14:21:18.857678.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_09T14_21_18.857678", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T14:21:18.857678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T14:21:18.857678.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_09T14_21_18.857678", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T14:21:18.857678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T14:21:18.857678.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_09T14_21_18.857678", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T14:21:18.857678.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T14:21:18.857678.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_09T14_21_18.857678", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T14:21:18.857678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T14:21:18.857678.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_09T14_21_18.857678", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T14:21:18.857678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T14:21:18.857678.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_09T14_21_18.857678", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T14:21:18.857678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T14:21:18.857678.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_09T14_21_18.857678", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T14:21:18.857678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T14:21:18.857678.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_09T14_21_18.857678", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T14:21:18.857678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T14:21:18.857678.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_09T14_21_18.857678", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T14:21:18.857678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T14:21:18.857678.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_09T14_21_18.857678", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T14:21:18.857678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T14:21:18.857678.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_09T14_21_18.857678", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T14:21:18.857678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T14:21:18.857678.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_09T14_21_18.857678", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T14:21:18.857678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T14:21:18.857678.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_09T14_21_18.857678", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T14:21:18.857678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T14:21:18.857678.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_08_09T14_21_18.857678", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T14:21:18.857678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T14:21:18.857678.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_09T14_21_18.857678", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T14:21:18.857678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T14:21:18.857678.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_09T14_21_18.857678", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T14:21:18.857678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T14:21:18.857678.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_09T14_21_18.857678", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T14:21:18.857678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T14:21:18.857678.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_09T14_21_18.857678", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T14:21:18.857678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T14:21:18.857678.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_09T14_21_18.857678", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T14:21:18.857678.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T14:21:18.857678.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_09T14_21_18.857678", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T14:21:18.857678.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T14:21:18.857678.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_19T00_22_24.283273", "path": ["**/details_harness|winogrande|5_2023-10-19T00-22-24.283273.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-19T00-22-24.283273.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_09T14_21_18.857678", "path": ["results_2023-08-09T14:21:18.857678.parquet"]}, {"split": "2023_10_19T00_22_24.283273", "path": ["results_2023-10-19T00-22-24.283273.parquet"]}, {"split": "latest", "path": ["results_2023-10-19T00-22-24.283273.parquet"]}]}]}
|
2023-10-18T23:22:36+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of jondurbin/airoboros-65b-gpt4-1.3
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model jondurbin/airoboros-65b-gpt4-1.3 on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, each split being named after the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-10-19T00:22:24.283273 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of jondurbin/airoboros-65b-gpt4-1.3",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model jondurbin/airoboros-65b-gpt4-1.3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-19T00:22:24.283273(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of jondurbin/airoboros-65b-gpt4-1.3",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model jondurbin/airoboros-65b-gpt4-1.3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-19T00:22:24.283273(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
24,
31,
172,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of jondurbin/airoboros-65b-gpt4-1.3## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model jondurbin/airoboros-65b-gpt4-1.3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-19T00:22:24.283273(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
8401847ad967108047f532b11d5aa5e53c1617b5
|
# Dataset Card for Evaluation run of jondurbin/airoboros-13b-gpt4-1.3
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/jondurbin/airoboros-13b-gpt4-1.3
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [jondurbin/airoboros-13b-gpt4-1.3](https://huggingface.co/jondurbin/airoboros-13b-gpt4-1.3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, each split being named after the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-13b-gpt4-1.3",
"harness_winogrande_5",
split="train")
```
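Since each run is exposed as a timestamp-named split, a specific run can be pinned instead of the moving "train"/"latest" pointer. A sketch (the split name below is the run timestamp as it appears in this card's config metadata):
```python
from datasets import load_dataset

# Pin the 2023-10-22 run explicitly rather than following "train".
data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-13b-gpt4-1.3",
	"harness_winogrande_5",
	split="2023_10_22T11_55_30.055248")
```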
## Latest results
These are the [latest results from run 2023-10-22T11:55:30.055248](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-13b-gpt4-1.3/blob/main/results_2023-10-22T11-55-30.055248.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.018036912751677854,
"em_stderr": 0.0013629136303228225,
"f1": 0.08087038590604041,
"f1_stderr": 0.0018769490051502272,
"acc": 0.3909936314193683,
"acc_stderr": 0.008100933725827915
},
"harness|drop|3": {
"em": 0.018036912751677854,
"em_stderr": 0.0013629136303228225,
"f1": 0.08087038590604041,
"f1_stderr": 0.0018769490051502272
},
"harness|gsm8k|5": {
"acc": 0.02350265352539803,
"acc_stderr": 0.004172883669643956
},
"harness|winogrande|5": {
"acc": 0.7584846093133386,
"acc_stderr": 0.012028983782011874
}
}
```
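The raw results file linked above can also be fetched directly as JSON, for instance with `huggingface_hub` (a sketch; the filename is the one referenced in the latest-results link, and the top-level layout is assumed to mirror the snippet above):
```python
import json

from huggingface_hub import hf_hub_download

path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_jondurbin__airoboros-13b-gpt4-1.3",
    filename="results_2023-10-22T11-55-30.055248.json",
    repo_type="dataset",
)
with open(path) as f:
    results = json.load(f)
print(results["all"])  # assumed key, per the snippet above
```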
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_jondurbin__airoboros-13b-gpt4-1.3
|
[
"region:us"
] |
2023-08-18T10:21:39+00:00
|
{"pretty_name": "Evaluation run of jondurbin/airoboros-13b-gpt4-1.3", "dataset_summary": "Dataset automatically created during the evaluation run of model [jondurbin/airoboros-13b-gpt4-1.3](https://huggingface.co/jondurbin/airoboros-13b-gpt4-1.3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jondurbin__airoboros-13b-gpt4-1.3\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-22T11:55:30.055248](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-13b-gpt4-1.3/blob/main/results_2023-10-22T11-55-30.055248.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.018036912751677854,\n \"em_stderr\": 0.0013629136303228225,\n \"f1\": 0.08087038590604041,\n \"f1_stderr\": 0.0018769490051502272,\n \"acc\": 0.3909936314193683,\n \"acc_stderr\": 0.008100933725827915\n },\n \"harness|drop|3\": {\n \"em\": 0.018036912751677854,\n \"em_stderr\": 0.0013629136303228225,\n \"f1\": 0.08087038590604041,\n \"f1_stderr\": 0.0018769490051502272\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.02350265352539803,\n \"acc_stderr\": 0.004172883669643956\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7584846093133386,\n \"acc_stderr\": 0.012028983782011874\n }\n}\n```", "repo_url": "https://huggingface.co/jondurbin/airoboros-13b-gpt4-1.3", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_09T08_50_11.313288", "path": ["**/details_harness|arc:challenge|25_2023-08-09T08:50:11.313288.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-09T08:50:11.313288.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_22T11_55_30.055248", "path": ["**/details_harness|drop|3_2023-10-22T11-55-30.055248.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-22T11-55-30.055248.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_22T11_55_30.055248", "path": ["**/details_harness|gsm8k|5_2023-10-22T11-55-30.055248.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-22T11-55-30.055248.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_09T08_50_11.313288", "path": ["**/details_harness|hellaswag|10_2023-08-09T08:50:11.313288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-09T08:50:11.313288.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_5", "data_files": [{"split": "2023_08_09T08_50_11.313288", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T08:50:11.313288.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T08:50:11.313288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T08:50:11.313288.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T08:50:11.313288.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T08:50:11.313288.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_08_09T08_50_11.313288", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T08:50:11.313288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T08:50:11.313288.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_09T08_50_11.313288", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T08:50:11.313288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T08:50:11.313288.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_09T08_50_11.313288", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T08:50:11.313288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T08:50:11.313288.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_09T08_50_11.313288", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T08:50:11.313288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T08:50:11.313288.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_09T08_50_11.313288", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T08:50:11.313288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T08:50:11.313288.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_09T08_50_11.313288", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T08:50:11.313288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T08:50:11.313288.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_09T08_50_11.313288", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T08:50:11.313288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T08:50:11.313288.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_09T08_50_11.313288", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T08:50:11.313288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T08:50:11.313288.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_09T08_50_11.313288", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T08:50:11.313288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T08:50:11.313288.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_09T08_50_11.313288", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T08:50:11.313288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T08:50:11.313288.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_09T08_50_11.313288", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T08:50:11.313288.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T08:50:11.313288.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_09T08_50_11.313288", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T08:50:11.313288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T08:50:11.313288.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_09T08_50_11.313288", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T08:50:11.313288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T08:50:11.313288.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_09T08_50_11.313288", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T08:50:11.313288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T08:50:11.313288.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_09T08_50_11.313288", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T08:50:11.313288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T08:50:11.313288.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_09T08_50_11.313288", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T08:50:11.313288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T08:50:11.313288.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_09T08_50_11.313288", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T08:50:11.313288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T08:50:11.313288.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_09T08_50_11.313288", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T08:50:11.313288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T08:50:11.313288.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_09T08_50_11.313288", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T08:50:11.313288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T08:50:11.313288.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_09T08_50_11.313288", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T08:50:11.313288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T08:50:11.313288.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_09T08_50_11.313288", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T08:50:11.313288.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T08:50:11.313288.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_09T08_50_11.313288", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T08:50:11.313288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T08:50:11.313288.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_09T08_50_11.313288", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T08:50:11.313288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T08:50:11.313288.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_09T08_50_11.313288", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T08:50:11.313288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T08:50:11.313288.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_09T08_50_11.313288", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T08:50:11.313288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T08:50:11.313288.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_09T08_50_11.313288", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T08:50:11.313288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T08:50:11.313288.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_09T08_50_11.313288", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T08:50:11.313288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T08:50:11.313288.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_09T08_50_11.313288", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T08:50:11.313288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T08:50:11.313288.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_09T08_50_11.313288", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T08:50:11.313288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T08:50:11.313288.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_09T08_50_11.313288", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T08:50:11.313288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T08:50:11.313288.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_09T08_50_11.313288", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T08:50:11.313288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T08:50:11.313288.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_09T08_50_11.313288", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T08:50:11.313288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T08:50:11.313288.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_09T08_50_11.313288", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T08:50:11.313288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T08:50:11.313288.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_09T08_50_11.313288", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T08:50:11.313288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T08:50:11.313288.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_09T08_50_11.313288", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T08:50:11.313288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T08:50:11.313288.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_09T08_50_11.313288", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T08:50:11.313288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T08:50:11.313288.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_09T08_50_11.313288", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T08:50:11.313288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T08:50:11.313288.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_09T08_50_11.313288", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T08:50:11.313288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T08:50:11.313288.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_09T08_50_11.313288", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T08:50:11.313288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T08:50:11.313288.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_09T08_50_11.313288", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T08:50:11.313288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T08:50:11.313288.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_09T08_50_11.313288", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T08:50:11.313288.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T08:50:11.313288.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_09T08_50_11.313288", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T08:50:11.313288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T08:50:11.313288.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_09T08_50_11.313288", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T08:50:11.313288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T08:50:11.313288.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_09T08_50_11.313288", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T08:50:11.313288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T08:50:11.313288.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_09T08_50_11.313288", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T08:50:11.313288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T08:50:11.313288.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_09T08_50_11.313288", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T08:50:11.313288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T08:50:11.313288.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_09T08_50_11.313288", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T08:50:11.313288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T08:50:11.313288.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_09T08_50_11.313288", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T08:50:11.313288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T08:50:11.313288.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_09T08_50_11.313288", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T08:50:11.313288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T08:50:11.313288.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_09T08_50_11.313288", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T08:50:11.313288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T08:50:11.313288.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_09T08_50_11.313288", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T08:50:11.313288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T08:50:11.313288.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_08_09T08_50_11.313288", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T08:50:11.313288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T08:50:11.313288.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_09T08_50_11.313288", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T08:50:11.313288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T08:50:11.313288.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_09T08_50_11.313288", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T08:50:11.313288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T08:50:11.313288.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_09T08_50_11.313288", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T08:50:11.313288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T08:50:11.313288.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_09T08_50_11.313288", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T08:50:11.313288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T08:50:11.313288.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_09T08_50_11.313288", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T08:50:11.313288.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T08:50:11.313288.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_09T08_50_11.313288", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T08:50:11.313288.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T08:50:11.313288.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_22T11_55_30.055248", "path": ["**/details_harness|winogrande|5_2023-10-22T11-55-30.055248.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-22T11-55-30.055248.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_09T08_50_11.313288", "path": ["results_2023-08-09T08:50:11.313288.parquet"]}, {"split": "2023_10_22T11_55_30.055248", "path": ["results_2023-10-22T11-55-30.055248.parquet"]}, {"split": "latest", "path": ["results_2023-10-22T11-55-30.055248.parquet"]}]}]}
|
2023-10-22T10:55:42+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of jondurbin/airoboros-13b-gpt4-1.3
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model jondurbin/airoboros-13b-gpt4-1.3 on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
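A minimal sketch of that call is shown below; the repository name is an assumption based on the `open-llm-leaderboard/details_...` naming convention used by the other detail repositories in this document:
```python
from datasets import load_dataset

# Repository name assumed from the leaderboard's details naming convention.
data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-13b-gpt4-1.3",
	"harness_winogrande_5",
	split="train")
```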
## Latest results
These are the latest results from run 2023-10-22T11:55:30.055248 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
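For the aggregated numbers mentioned above, a hedged example of reading the "results" configuration at its "latest" split (both names are listed in this card's config metadata; the repository name is again assumed from the leaderboard's naming convention):
```python
from datasets import load_dataset

# "results" and "latest" are the config/split names listed in this card's metadata.
results = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-13b-gpt4-1.3",
	"results",
	split="latest")
```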
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of jondurbin/airoboros-13b-gpt4-1.3",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model jondurbin/airoboros-13b-gpt4-1.3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-22T11:55:30.055248(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of jondurbin/airoboros-13b-gpt4-1.3",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model jondurbin/airoboros-13b-gpt4-1.3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-22T11:55:30.055248(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
23,
31,
171,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of jondurbin/airoboros-13b-gpt4-1.3## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model jondurbin/airoboros-13b-gpt4-1.3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-22T11:55:30.055248(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
ca6cdbe0af4f897a66850226da9b655f9a722dee
|
# Dataset Card for Evaluation run of jondurbin/airoboros-13b-gpt4-1.4-fp16
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/jondurbin/airoboros-13b-gpt4-1.4-fp16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [jondurbin/airoboros-13b-gpt4-1.4-fp16](https://huggingface.co/jondurbin/airoboros-13b-gpt4-1.4-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-13b-gpt4-1.4-fp16",
"harness_winogrande_5",
split="train")
```
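Since each run is stored as a timestamp-named split and "latest" always mirrors the newest run, you can either pin a specific run or track the most recent one. A small sketch, using the winogrande timestamp split listed in this card's config metadata:
```python
from datasets import load_dataset

# Pin the specific winogrande run listed in this card's metadata ...
pinned = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-13b-gpt4-1.4-fp16",
	"harness_winogrande_5",
	split="2023_10_19T14_04_40.493722")

# ... or always follow the newest run.
latest = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-13b-gpt4-1.4-fp16",
	"harness_winogrande_5",
	split="latest")
```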
## Latest results
These are the [latest results from run 2023-10-19T14:04:40.493722](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-13b-gpt4-1.4-fp16/blob/main/results_2023-10-19T14-04-40.493722.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.05285234899328859,
"em_stderr": 0.0022912930700355423,
"f1": 0.11820364932885902,
"f1_stderr": 0.0026017641356238645,
"acc": 0.41988112541310807,
"acc_stderr": 0.009659506214512746
},
"harness|drop|3": {
"em": 0.05285234899328859,
"em_stderr": 0.0022912930700355423,
"f1": 0.11820364932885902,
"f1_stderr": 0.0026017641356238645
},
"harness|gsm8k|5": {
"acc": 0.07733131159969674,
"acc_stderr": 0.007357713523222348
},
"harness|winogrande|5": {
"acc": 0.7624309392265194,
"acc_stderr": 0.011961298905803143
}
}
```
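The same figures can be pulled programmatically from the "results" configuration rather than copied from the JSON above. A minimal sketch; the exact column layout of the results parquet is not documented in this card, so inspect a loaded row:
```python
from datasets import load_dataset

# "results" and "latest" are the config/split names listed in this card's metadata.
results = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-13b-gpt4-1.4-fp16",
	"results",
	split="latest")
print(results[0])  # inspect the row holding the aggregated metrics
```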
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_jondurbin__airoboros-13b-gpt4-1.4-fp16
|
[
"region:us"
] |
2023-08-18T10:21:48+00:00
|
{"pretty_name": "Evaluation run of jondurbin/airoboros-13b-gpt4-1.4-fp16", "dataset_summary": "Dataset automatically created during the evaluation run of model [jondurbin/airoboros-13b-gpt4-1.4-fp16](https://huggingface.co/jondurbin/airoboros-13b-gpt4-1.4-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jondurbin__airoboros-13b-gpt4-1.4-fp16\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-19T14:04:40.493722](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-13b-gpt4-1.4-fp16/blob/main/results_2023-10-19T14-04-40.493722.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.05285234899328859,\n \"em_stderr\": 0.0022912930700355423,\n \"f1\": 0.11820364932885902,\n \"f1_stderr\": 0.0026017641356238645,\n \"acc\": 0.41988112541310807,\n \"acc_stderr\": 0.009659506214512746\n },\n \"harness|drop|3\": {\n \"em\": 0.05285234899328859,\n \"em_stderr\": 0.0022912930700355423,\n \"f1\": 0.11820364932885902,\n \"f1_stderr\": 0.0026017641356238645\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.07733131159969674,\n \"acc_stderr\": 0.007357713523222348\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7624309392265194,\n \"acc_stderr\": 0.011961298905803143\n }\n}\n```", "repo_url": "https://huggingface.co/jondurbin/airoboros-13b-gpt4-1.4-fp16", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_03T11_11_18.095380", "path": ["**/details_harness|arc:challenge|25_2023-08-03T11:11:18.095380.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-03T11:11:18.095380.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_19T14_04_40.493722", "path": ["**/details_harness|drop|3_2023-10-19T14-04-40.493722.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-19T14-04-40.493722.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_19T14_04_40.493722", "path": ["**/details_harness|gsm8k|5_2023-10-19T14-04-40.493722.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-19T14-04-40.493722.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_03T11_11_18.095380", "path": ["**/details_harness|hellaswag|10_2023-08-03T11:11:18.095380.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hellaswag|10_2023-08-03T11:11:18.095380.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_03T11_11_18.095380", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-03T11:11:18.095380.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-03T11:11:18.095380.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-03T11:11:18.095380.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-03T11:11:18.095380.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-03T11:11:18.095380.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-08-03T11:11:18.095380.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_03T11_11_18.095380", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-03T11:11:18.095380.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-03T11:11:18.095380.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_03T11_11_18.095380", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-03T11:11:18.095380.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-03T11:11:18.095380.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_03T11_11_18.095380", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-03T11:11:18.095380.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-03T11:11:18.095380.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_03T11_11_18.095380", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-03T11:11:18.095380.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-03T11:11:18.095380.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_03T11_11_18.095380", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-03T11:11:18.095380.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-03T11:11:18.095380.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_03T11_11_18.095380", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-03T11:11:18.095380.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-03T11:11:18.095380.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_03T11_11_18.095380", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-03T11:11:18.095380.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-03T11:11:18.095380.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_03T11_11_18.095380", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-03T11:11:18.095380.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-03T11:11:18.095380.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_03T11_11_18.095380", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-03T11:11:18.095380.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-03T11:11:18.095380.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_03T11_11_18.095380", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-03T11:11:18.095380.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-03T11:11:18.095380.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_08_03T11_11_18.095380", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-03T11:11:18.095380.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-03T11:11:18.095380.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_03T11_11_18.095380", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-03T11:11:18.095380.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-03T11:11:18.095380.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_03T11_11_18.095380", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-03T11:11:18.095380.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-03T11:11:18.095380.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_03T11_11_18.095380", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-03T11:11:18.095380.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-03T11:11:18.095380.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_03T11_11_18.095380", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-03T11:11:18.095380.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-03T11:11:18.095380.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_03T11_11_18.095380", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-03T11:11:18.095380.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-03T11:11:18.095380.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_03T11_11_18.095380", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-03T11:11:18.095380.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-03T11:11:18.095380.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_03T11_11_18.095380", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-03T11:11:18.095380.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-03T11:11:18.095380.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_03T11_11_18.095380", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-03T11:11:18.095380.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-03T11:11:18.095380.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_03T11_11_18.095380", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-03T11:11:18.095380.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-03T11:11:18.095380.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_03T11_11_18.095380", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-03T11:11:18.095380.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-03T11:11:18.095380.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_03T11_11_18.095380", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-03T11:11:18.095380.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-03T11:11:18.095380.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_03T11_11_18.095380", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-03T11:11:18.095380.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-03T11:11:18.095380.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_03T11_11_18.095380", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-03T11:11:18.095380.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-03T11:11:18.095380.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_03T11_11_18.095380", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-03T11:11:18.095380.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-03T11:11:18.095380.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_03T11_11_18.095380", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-03T11:11:18.095380.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-03T11:11:18.095380.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_03T11_11_18.095380", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-03T11:11:18.095380.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-03T11:11:18.095380.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_03T11_11_18.095380", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-03T11:11:18.095380.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-03T11:11:18.095380.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_03T11_11_18.095380", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-03T11:11:18.095380.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-03T11:11:18.095380.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_03T11_11_18.095380", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-03T11:11:18.095380.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-03T11:11:18.095380.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_03T11_11_18.095380", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-03T11:11:18.095380.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-03T11:11:18.095380.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_03T11_11_18.095380", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-03T11:11:18.095380.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-03T11:11:18.095380.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_03T11_11_18.095380", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-03T11:11:18.095380.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-03T11:11:18.095380.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_03T11_11_18.095380", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-03T11:11:18.095380.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-03T11:11:18.095380.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_03T11_11_18.095380", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-03T11:11:18.095380.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-03T11:11:18.095380.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_03T11_11_18.095380", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-03T11:11:18.095380.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-03T11:11:18.095380.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_03T11_11_18.095380", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-03T11:11:18.095380.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-03T11:11:18.095380.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_03T11_11_18.095380", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-03T11:11:18.095380.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-03T11:11:18.095380.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_03T11_11_18.095380", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-03T11:11:18.095380.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-03T11:11:18.095380.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_03T11_11_18.095380", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-03T11:11:18.095380.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-03T11:11:18.095380.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_03T11_11_18.095380", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-03T11:11:18.095380.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-03T11:11:18.095380.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_03T11_11_18.095380", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-03T11:11:18.095380.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-03T11:11:18.095380.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_03T11_11_18.095380", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-03T11:11:18.095380.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-03T11:11:18.095380.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_03T11_11_18.095380", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-03T11:11:18.095380.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-03T11:11:18.095380.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_03T11_11_18.095380", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-03T11:11:18.095380.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-03T11:11:18.095380.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_03T11_11_18.095380", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-03T11:11:18.095380.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-03T11:11:18.095380.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_03T11_11_18.095380", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-03T11:11:18.095380.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-03T11:11:18.095380.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_03T11_11_18.095380", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-03T11:11:18.095380.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-03T11:11:18.095380.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_03T11_11_18.095380", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-03T11:11:18.095380.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-03T11:11:18.095380.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_03T11_11_18.095380", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-03T11:11:18.095380.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-03T11:11:18.095380.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_03T11_11_18.095380", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-03T11:11:18.095380.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-03T11:11:18.095380.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_03T11_11_18.095380", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-03T11:11:18.095380.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-03T11:11:18.095380.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_03T11_11_18.095380", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-03T11:11:18.095380.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-03T11:11:18.095380.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_03T11_11_18.095380", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-03T11:11:18.095380.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-03T11:11:18.095380.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_03T11_11_18.095380", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-03T11:11:18.095380.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-03T11:11:18.095380.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_03T11_11_18.095380", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-03T11:11:18.095380.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-03T11:11:18.095380.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_03T11_11_18.095380", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-03T11:11:18.095380.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-03T11:11:18.095380.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_03T11_11_18.095380", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-03T11:11:18.095380.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-03T11:11:18.095380.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_19T14_04_40.493722", "path": ["**/details_harness|winogrande|5_2023-10-19T14-04-40.493722.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-19T14-04-40.493722.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_03T11_11_18.095380", "path": ["results_2023-08-03T11:11:18.095380.parquet"]}, {"split": "2023_10_19T14_04_40.493722", "path": ["results_2023-10-19T14-04-40.493722.parquet"]}, {"split": "latest", "path": ["results_2023-10-19T14-04-40.493722.parquet"]}]}]}
|
2023-10-19T13:04:53+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of jondurbin/airoboros-13b-gpt4-1.4-fp16
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model jondurbin/airoboros-13b-gpt4-1.4-fp16 on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
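For instance, a minimal sketch using the `datasets` library (the repository id below is an assumption, following the leaderboard's `details_{org}__{model}` naming pattern):
```python
from datasets import load_dataset

# Repo id assumed from the leaderboard's details_{org}__{model} naming pattern
data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-13b-gpt4-1.4-fp16",
	"harness_winogrande_5",
	split="train")
```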
## Latest results
These are the latest results from run 2023-10-19T14:04:40.493722 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of jondurbin/airoboros-13b-gpt4-1.4-fp16",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model jondurbin/airoboros-13b-gpt4-1.4-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-19T14:04:40.493722(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of jondurbin/airoboros-13b-gpt4-1.4-fp16",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model jondurbin/airoboros-13b-gpt4-1.4-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-19T14:04:40.493722(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
27,
31,
175,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of jondurbin/airoboros-13b-gpt4-1.4-fp16## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model jondurbin/airoboros-13b-gpt4-1.4-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-19T14:04:40.493722(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
0b4341a9a141437d762317165dd1b7db15066ef7
|
# Dataset Card for Evaluation run of jondurbin/airoboros-l2-70b-gpt4-2.0
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/jondurbin/airoboros-l2-70b-gpt4-2.0
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [jondurbin/airoboros-l2-70b-gpt4-2.0](https://huggingface.co/jondurbin/airoboros-l2-70b-gpt4-2.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-l2-70b-gpt4-2.0",
"harness_winogrande_5",
split="train")
```
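The aggregated metrics described above live in the "results" configuration; a minimal sketch of loading its most recent snapshot, using the "latest" split listed in this card's metadata:
```python
from datasets import load_dataset

# "results" holds the aggregated metrics; the "latest" split always points
# to the most recent run for this model.
results = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-l2-70b-gpt4-2.0",
	"results",
	split="latest")
```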
## Latest results
These are the [latest results from run 2023-10-23T03:12:02.680525](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-l2-70b-gpt4-2.0/blob/main/results_2023-10-23T03-12-02.680525.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.34312080536912754,
"em_stderr": 0.004861898980661869,
"f1": 0.406266778523491,
"f1_stderr": 0.004698880247232182,
"acc": 0.5411001733512928,
"acc_stderr": 0.011156340755977264
},
"harness|drop|3": {
"em": 0.34312080536912754,
"em_stderr": 0.004861898980661869,
"f1": 0.406266778523491,
"f1_stderr": 0.004698880247232182
},
"harness|gsm8k|5": {
"acc": 0.24715693707354056,
"acc_stderr": 0.011881764043717088
},
"harness|winogrande|5": {
"acc": 0.835043409629045,
"acc_stderr": 0.010430917468237441
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_jondurbin__airoboros-l2-70b-gpt4-2.0
|
[
"region:us"
] |
2023-08-18T10:21:56+00:00
|
{"pretty_name": "Evaluation run of jondurbin/airoboros-l2-70b-gpt4-2.0", "dataset_summary": "Dataset automatically created during the evaluation run of model [jondurbin/airoboros-l2-70b-gpt4-2.0](https://huggingface.co/jondurbin/airoboros-l2-70b-gpt4-2.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jondurbin__airoboros-l2-70b-gpt4-2.0\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-23T03:12:02.680525](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-l2-70b-gpt4-2.0/blob/main/results_2023-10-23T03-12-02.680525.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.34312080536912754,\n \"em_stderr\": 0.004861898980661869,\n \"f1\": 0.406266778523491,\n \"f1_stderr\": 0.004698880247232182,\n \"acc\": 0.5411001733512928,\n \"acc_stderr\": 0.011156340755977264\n },\n \"harness|drop|3\": {\n \"em\": 0.34312080536912754,\n \"em_stderr\": 0.004861898980661869,\n \"f1\": 0.406266778523491,\n \"f1_stderr\": 0.004698880247232182\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.24715693707354056,\n \"acc_stderr\": 0.011881764043717088\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.835043409629045,\n \"acc_stderr\": 0.010430917468237441\n }\n}\n```", "repo_url": "https://huggingface.co/jondurbin/airoboros-l2-70b-gpt4-2.0", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_10T00_04_11.236941", "path": ["**/details_harness|arc:challenge|25_2023-08-10T00:04:11.236941.parquet"]}, {"split": "2023_08_19T00_48_59.636533", "path": ["**/details_harness|arc:challenge|25_2023-08-19T00:48:59.636533.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-19T00:48:59.636533.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_19T09_58_31.478487", "path": ["**/details_harness|drop|3_2023-10-19T09-58-31.478487.parquet"]}, {"split": "2023_10_23T03_12_02.680525", "path": ["**/details_harness|drop|3_2023-10-23T03-12-02.680525.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-23T03-12-02.680525.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_19T09_58_31.478487", "path": ["**/details_harness|gsm8k|5_2023-10-19T09-58-31.478487.parquet"]}, {"split": "2023_10_23T03_12_02.680525", "path": ["**/details_harness|gsm8k|5_2023-10-23T03-12-02.680525.parquet"]}, {"split": "latest", "path": 
["**/details_harness|gsm8k|5_2023-10-23T03-12-02.680525.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_10T00_04_11.236941", "path": ["**/details_harness|hellaswag|10_2023-08-10T00:04:11.236941.parquet"]}, {"split": "2023_08_19T00_48_59.636533", "path": ["**/details_harness|hellaswag|10_2023-08-19T00:48:59.636533.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-19T00:48:59.636533.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_10T00_04_11.236941", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-10T00:04:11.236941.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-10T00:04:11.236941.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-10T00:04:11.236941.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-10T00:04:11.236941.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-10T00:04:11.236941.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-10T00:04:11.236941.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-10T00:04:11.236941.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-10T00:04:11.236941.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-10T00:04:11.236941.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-10T00:04:11.236941.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-10T00:04:11.236941.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-10T00:04:11.236941.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-10T00:04:11.236941.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-10T00:04:11.236941.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-10T00:04:11.236941.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-10T00:04:11.236941.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-10T00:04:11.236941.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-10T00:04:11.236941.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-10T00:04:11.236941.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-10T00:04:11.236941.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-10T00:04:11.236941.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-10T00:04:11.236941.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-10T00:04:11.236941.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-10T00:04:11.236941.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-10T00:04:11.236941.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-10T00:04:11.236941.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-10T00:04:11.236941.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-10T00:04:11.236941.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-10T00:04:11.236941.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-10T00:04:11.236941.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-10T00:04:11.236941.parquet", 
"**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-10T00:04:11.236941.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-10T00:04:11.236941.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-10T00:04:11.236941.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-10T00:04:11.236941.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-10T00:04:11.236941.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-10T00:04:11.236941.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-10T00:04:11.236941.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-10T00:04:11.236941.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-10T00:04:11.236941.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-10T00:04:11.236941.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-10T00:04:11.236941.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-10T00:04:11.236941.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-10T00:04:11.236941.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-10T00:04:11.236941.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-10T00:04:11.236941.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-10T00:04:11.236941.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-10T00:04:11.236941.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-10T00:04:11.236941.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-10T00:04:11.236941.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-10T00:04:11.236941.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-10T00:04:11.236941.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-10T00:04:11.236941.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-10T00:04:11.236941.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-10T00:04:11.236941.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-10T00:04:11.236941.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-10T00:04:11.236941.parquet"]}, {"split": "2023_08_19T00_48_59.636533", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-19T00:48:59.636533.parquet", 
"**/details_harness|hendrycksTest-econometrics|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-19T00:48:59.636533.parquet", 
"**/details_harness|hendrycksTest-public_relations|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-19T00:48:59.636533.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-19T00:48:59.636533.parquet", 
"**/details_harness|hendrycksTest-human_aging|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-19T00:48:59.636533.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-19T00:48:59.636533.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_10T00_04_11.236941", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-10T00:04:11.236941.parquet"]}, {"split": "2023_08_19T00_48_59.636533", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-19T00:48:59.636533.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-19T00:48:59.636533.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_10T00_04_11.236941", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-10T00:04:11.236941.parquet"]}, {"split": "2023_08_19T00_48_59.636533", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-19T00:48:59.636533.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-19T00:48:59.636533.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_10T00_04_11.236941", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-10T00:04:11.236941.parquet"]}, {"split": "2023_08_19T00_48_59.636533", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-19T00:48:59.636533.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-astronomy|5_2023-08-19T00:48:59.636533.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_10T00_04_11.236941", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-10T00:04:11.236941.parquet"]}, {"split": "2023_08_19T00_48_59.636533", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-19T00:48:59.636533.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-19T00:48:59.636533.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_10T00_04_11.236941", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-10T00:04:11.236941.parquet"]}, {"split": "2023_08_19T00_48_59.636533", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-19T00:48:59.636533.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-19T00:48:59.636533.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_10T00_04_11.236941", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-10T00:04:11.236941.parquet"]}, {"split": "2023_08_19T00_48_59.636533", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-19T00:48:59.636533.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-19T00:48:59.636533.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_10T00_04_11.236941", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-10T00:04:11.236941.parquet"]}, {"split": "2023_08_19T00_48_59.636533", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-19T00:48:59.636533.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-19T00:48:59.636533.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_10T00_04_11.236941", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-10T00:04:11.236941.parquet"]}, {"split": "2023_08_19T00_48_59.636533", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-19T00:48:59.636533.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-19T00:48:59.636533.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_10T00_04_11.236941", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-10T00:04:11.236941.parquet"]}, {"split": "2023_08_19T00_48_59.636533", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-19T00:48:59.636533.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-19T00:48:59.636533.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_10T00_04_11.236941", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-10T00:04:11.236941.parquet"]}, {"split": "2023_08_19T00_48_59.636533", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-19T00:48:59.636533.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-19T00:48:59.636533.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", 
"data_files": [{"split": "2023_08_10T00_04_11.236941", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-10T00:04:11.236941.parquet"]}, {"split": "2023_08_19T00_48_59.636533", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-19T00:48:59.636533.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-19T00:48:59.636533.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_10T00_04_11.236941", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-10T00:04:11.236941.parquet"]}, {"split": "2023_08_19T00_48_59.636533", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-19T00:48:59.636533.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-19T00:48:59.636533.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_10T00_04_11.236941", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-10T00:04:11.236941.parquet"]}, {"split": "2023_08_19T00_48_59.636533", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-19T00:48:59.636533.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-19T00:48:59.636533.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_10T00_04_11.236941", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-10T00:04:11.236941.parquet"]}, {"split": "2023_08_19T00_48_59.636533", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-19T00:48:59.636533.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-19T00:48:59.636533.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_10T00_04_11.236941", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-10T00:04:11.236941.parquet"]}, {"split": "2023_08_19T00_48_59.636533", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-19T00:48:59.636533.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-19T00:48:59.636533.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_10T00_04_11.236941", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-10T00:04:11.236941.parquet"]}, {"split": "2023_08_19T00_48_59.636533", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-19T00:48:59.636533.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-19T00:48:59.636533.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_10T00_04_11.236941", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-10T00:04:11.236941.parquet"]}, {"split": "2023_08_19T00_48_59.636533", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-19T00:48:59.636533.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-19T00:48:59.636533.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_10T00_04_11.236941", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-10T00:04:11.236941.parquet"]}, {"split": 
"2023_08_19T00_48_59.636533", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-19T00:48:59.636533.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-19T00:48:59.636533.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_10T00_04_11.236941", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-10T00:04:11.236941.parquet"]}, {"split": "2023_08_19T00_48_59.636533", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-19T00:48:59.636533.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-19T00:48:59.636533.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_10T00_04_11.236941", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-10T00:04:11.236941.parquet"]}, {"split": "2023_08_19T00_48_59.636533", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-19T00:48:59.636533.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-19T00:48:59.636533.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_10T00_04_11.236941", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-10T00:04:11.236941.parquet"]}, {"split": "2023_08_19T00_48_59.636533", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-19T00:48:59.636533.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-19T00:48:59.636533.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_10T00_04_11.236941", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-10T00:04:11.236941.parquet"]}, {"split": "2023_08_19T00_48_59.636533", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-19T00:48:59.636533.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-19T00:48:59.636533.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_10T00_04_11.236941", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-10T00:04:11.236941.parquet"]}, {"split": "2023_08_19T00_48_59.636533", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-19T00:48:59.636533.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-19T00:48:59.636533.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_10T00_04_11.236941", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-10T00:04:11.236941.parquet"]}, {"split": "2023_08_19T00_48_59.636533", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-19T00:48:59.636533.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-19T00:48:59.636533.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_10T00_04_11.236941", "path": 
["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-10T00:04:11.236941.parquet"]}, {"split": "2023_08_19T00_48_59.636533", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-19T00:48:59.636533.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-19T00:48:59.636533.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_10T00_04_11.236941", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-10T00:04:11.236941.parquet"]}, {"split": "2023_08_19T00_48_59.636533", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-19T00:48:59.636533.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-19T00:48:59.636533.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_10T00_04_11.236941", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-10T00:04:11.236941.parquet"]}, {"split": "2023_08_19T00_48_59.636533", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-19T00:48:59.636533.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-19T00:48:59.636533.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_10T00_04_11.236941", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-10T00:04:11.236941.parquet"]}, {"split": "2023_08_19T00_48_59.636533", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-19T00:48:59.636533.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-19T00:48:59.636533.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_10T00_04_11.236941", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-10T00:04:11.236941.parquet"]}, {"split": "2023_08_19T00_48_59.636533", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-19T00:48:59.636533.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-19T00:48:59.636533.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_10T00_04_11.236941", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-10T00:04:11.236941.parquet"]}, {"split": "2023_08_19T00_48_59.636533", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-19T00:48:59.636533.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-19T00:48:59.636533.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_10T00_04_11.236941", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-10T00:04:11.236941.parquet"]}, {"split": "2023_08_19T00_48_59.636533", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-19T00:48:59.636533.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-19T00:48:59.636533.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_10T00_04_11.236941", "path": 
["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-10T00:04:11.236941.parquet"]}, {"split": "2023_08_19T00_48_59.636533", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-19T00:48:59.636533.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-19T00:48:59.636533.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_10T00_04_11.236941", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-10T00:04:11.236941.parquet"]}, {"split": "2023_08_19T00_48_59.636533", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-19T00:48:59.636533.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-19T00:48:59.636533.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_10T00_04_11.236941", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-10T00:04:11.236941.parquet"]}, {"split": "2023_08_19T00_48_59.636533", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-19T00:48:59.636533.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-19T00:48:59.636533.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_10T00_04_11.236941", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-10T00:04:11.236941.parquet"]}, {"split": "2023_08_19T00_48_59.636533", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-19T00:48:59.636533.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-19T00:48:59.636533.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_10T00_04_11.236941", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-10T00:04:11.236941.parquet"]}, {"split": "2023_08_19T00_48_59.636533", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-19T00:48:59.636533.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-19T00:48:59.636533.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_10T00_04_11.236941", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-10T00:04:11.236941.parquet"]}, {"split": "2023_08_19T00_48_59.636533", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-19T00:48:59.636533.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-19T00:48:59.636533.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_10T00_04_11.236941", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-10T00:04:11.236941.parquet"]}, {"split": "2023_08_19T00_48_59.636533", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-19T00:48:59.636533.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-19T00:48:59.636533.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_10T00_04_11.236941", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-10T00:04:11.236941.parquet"]}, {"split": "2023_08_19T00_48_59.636533", "path": 
["**/details_harness|hendrycksTest-management|5_2023-08-19T00:48:59.636533.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-19T00:48:59.636533.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_10T00_04_11.236941", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-10T00:04:11.236941.parquet"]}, {"split": "2023_08_19T00_48_59.636533", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-19T00:48:59.636533.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-19T00:48:59.636533.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_10T00_04_11.236941", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-10T00:04:11.236941.parquet"]}, {"split": "2023_08_19T00_48_59.636533", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-19T00:48:59.636533.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-19T00:48:59.636533.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_10T00_04_11.236941", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-10T00:04:11.236941.parquet"]}, {"split": "2023_08_19T00_48_59.636533", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-19T00:48:59.636533.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-19T00:48:59.636533.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_10T00_04_11.236941", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-10T00:04:11.236941.parquet"]}, {"split": "2023_08_19T00_48_59.636533", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-19T00:48:59.636533.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-19T00:48:59.636533.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_10T00_04_11.236941", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-10T00:04:11.236941.parquet"]}, {"split": "2023_08_19T00_48_59.636533", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-19T00:48:59.636533.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-19T00:48:59.636533.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_10T00_04_11.236941", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-10T00:04:11.236941.parquet"]}, {"split": "2023_08_19T00_48_59.636533", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-19T00:48:59.636533.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-19T00:48:59.636533.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_10T00_04_11.236941", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-10T00:04:11.236941.parquet"]}, {"split": "2023_08_19T00_48_59.636533", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-19T00:48:59.636533.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-19T00:48:59.636533.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": 
"2023_08_10T00_04_11.236941", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-10T00:04:11.236941.parquet"]}, {"split": "2023_08_19T00_48_59.636533", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-19T00:48:59.636533.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-19T00:48:59.636533.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_10T00_04_11.236941", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-10T00:04:11.236941.parquet"]}, {"split": "2023_08_19T00_48_59.636533", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-19T00:48:59.636533.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-19T00:48:59.636533.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_10T00_04_11.236941", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-10T00:04:11.236941.parquet"]}, {"split": "2023_08_19T00_48_59.636533", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-19T00:48:59.636533.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-19T00:48:59.636533.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_10T00_04_11.236941", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-10T00:04:11.236941.parquet"]}, {"split": "2023_08_19T00_48_59.636533", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-19T00:48:59.636533.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-19T00:48:59.636533.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_10T00_04_11.236941", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-10T00:04:11.236941.parquet"]}, {"split": "2023_08_19T00_48_59.636533", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-19T00:48:59.636533.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-19T00:48:59.636533.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_10T00_04_11.236941", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-10T00:04:11.236941.parquet"]}, {"split": "2023_08_19T00_48_59.636533", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-19T00:48:59.636533.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-19T00:48:59.636533.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_10T00_04_11.236941", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-10T00:04:11.236941.parquet"]}, {"split": "2023_08_19T00_48_59.636533", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-19T00:48:59.636533.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-19T00:48:59.636533.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_10T00_04_11.236941", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-10T00:04:11.236941.parquet"]}, {"split": 
"2023_08_19T00_48_59.636533", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-19T00:48:59.636533.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-19T00:48:59.636533.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_10T00_04_11.236941", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-10T00:04:11.236941.parquet"]}, {"split": "2023_08_19T00_48_59.636533", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-19T00:48:59.636533.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-19T00:48:59.636533.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_10T00_04_11.236941", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-10T00:04:11.236941.parquet"]}, {"split": "2023_08_19T00_48_59.636533", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-19T00:48:59.636533.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-19T00:48:59.636533.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_10T00_04_11.236941", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-10T00:04:11.236941.parquet"]}, {"split": "2023_08_19T00_48_59.636533", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-19T00:48:59.636533.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-19T00:48:59.636533.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_10T00_04_11.236941", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-10T00:04:11.236941.parquet"]}, {"split": "2023_08_19T00_48_59.636533", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-19T00:48:59.636533.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-19T00:48:59.636533.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_19T09_58_31.478487", "path": ["**/details_harness|winogrande|5_2023-10-19T09-58-31.478487.parquet"]}, {"split": "2023_10_23T03_12_02.680525", "path": ["**/details_harness|winogrande|5_2023-10-23T03-12-02.680525.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-23T03-12-02.680525.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_10T00_04_11.236941", "path": ["results_2023-08-10T00:04:11.236941.parquet"]}, {"split": "2023_10_19T09_58_31.478487", "path": ["results_2023-10-19T09-58-31.478487.parquet"]}, {"split": "2023_10_23T03_12_02.680525", "path": ["results_2023-10-23T03-12-02.680525.parquet"]}, {"split": "latest", "path": ["results_2023-10-23T03-12-02.680525.parquet"]}]}]}
|
2023-10-23T02:12:15+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of jondurbin/airoboros-l2-70b-gpt4-2.0
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model jondurbin/airoboros-l2-70b-gpt4-2.0 on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
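The original code block was stripped from this condensed card; the sketch below follows the standard loading pattern for these leaderboard detail datasets, and the repository name is inferred from the leaderboard's `details_<org>__<model>` naming convention rather than taken from this card.

```python
from datasets import load_dataset

# Repository name inferred from the leaderboard's details_<org>__<model> pattern
data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-l2-70b-gpt4-2.0",
	"harness_winogrande_5",
	split="train")
```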
## Latest results
These are the latest results from run 2023-10-23T03:12:02.680525 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
82336da3d545220e782662417cc76f0c2ad910cb
|
# Dataset Card for Evaluation run of jondurbin/airoboros-l2-70b-gpt4-m2.0
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/jondurbin/airoboros-l2-70b-gpt4-m2.0
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [jondurbin/airoboros-l2-70b-gpt4-m2.0](https://huggingface.co/jondurbin/airoboros-l2-70b-gpt4-m2.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-l2-70b-gpt4-m2.0",
"harness_winogrande_5",
split="train")
```
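Because every run is also stored as a timestamped split, you can pin a specific evaluation instead of relying on `train`; the `latest` split used below is listed in this card's configuration metadata and always mirrors the most recent run:

```python
from datasets import load_dataset

# "latest" mirrors the most recent run of the chosen configuration
data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-l2-70b-gpt4-m2.0",
	"harness_winogrande_5",
	split="latest")
```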
## Latest results
These are the [latest results from run 2023-10-23T01:01:59.537351](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-l2-70b-gpt4-m2.0/blob/main/results_2023-10-23T01-01-59.537351.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.19955956375838926,
"em_stderr": 0.004092987650196818,
"f1": 0.2619746224832211,
"f1_stderr": 0.004049847572493045,
"acc": 0.5449064818543622,
"acc_stderr": 0.011200400992385444
},
"harness|drop|3": {
"em": 0.19955956375838926,
"em_stderr": 0.004092987650196818,
"f1": 0.2619746224832211,
"f1_stderr": 0.004049847572493045
},
"harness|gsm8k|5": {
"acc": 0.2539802880970432,
"acc_stderr": 0.011989952209548084
},
"harness|winogrande|5": {
"acc": 0.8358326756116812,
"acc_stderr": 0.010410849775222804
}
}
```
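To work with these aggregated numbers programmatically instead of copying them from the JSON above, you can load the "results" configuration; a minimal sketch, assuming the standard `datasets` API:

```python
from datasets import load_dataset

# The "results" config stores one row per evaluation run; "latest" points at the run shown above
results = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-l2-70b-gpt4-m2.0",
	"results",
	split="latest")
print(results[0])  # inspect the stored metrics (em, f1, acc and their stderr)
```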
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_jondurbin__airoboros-l2-70b-gpt4-m2.0
|
[
"region:us"
] |
2023-08-18T10:22:04+00:00
|
{"pretty_name": "Evaluation run of jondurbin/airoboros-l2-70b-gpt4-m2.0", "dataset_summary": "Dataset automatically created during the evaluation run of model [jondurbin/airoboros-l2-70b-gpt4-m2.0](https://huggingface.co/jondurbin/airoboros-l2-70b-gpt4-m2.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jondurbin__airoboros-l2-70b-gpt4-m2.0\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-23T01:01:59.537351](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-l2-70b-gpt4-m2.0/blob/main/results_2023-10-23T01-01-59.537351.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.19955956375838926,\n \"em_stderr\": 0.004092987650196818,\n \"f1\": 0.2619746224832211,\n \"f1_stderr\": 0.004049847572493045,\n \"acc\": 0.5449064818543622,\n \"acc_stderr\": 0.011200400992385444\n },\n \"harness|drop|3\": {\n \"em\": 0.19955956375838926,\n \"em_stderr\": 0.004092987650196818,\n \"f1\": 0.2619746224832211,\n \"f1_stderr\": 0.004049847572493045\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2539802880970432,\n \"acc_stderr\": 0.011989952209548084\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8358326756116812,\n \"acc_stderr\": 0.010410849775222804\n }\n}\n```", "repo_url": "https://huggingface.co/jondurbin/airoboros-l2-70b-gpt4-m2.0", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_09T22_06_19.540113", "path": ["**/details_harness|arc:challenge|25_2023-08-09T22:06:19.540113.parquet"]}, {"split": "2023_08_10T00_00_29.305175", "path": ["**/details_harness|arc:challenge|25_2023-08-10T00:00:29.305175.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-10T00:00:29.305175.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_23T01_01_59.537351", "path": ["**/details_harness|drop|3_2023-10-23T01-01-59.537351.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-23T01-01-59.537351.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_23T01_01_59.537351", "path": ["**/details_harness|gsm8k|5_2023-10-23T01-01-59.537351.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-23T01-01-59.537351.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_09T22_06_19.540113", "path": 
["**/details_harness|hellaswag|10_2023-08-09T22:06:19.540113.parquet"]}, {"split": "2023_08_10T00_00_29.305175", "path": ["**/details_harness|hellaswag|10_2023-08-10T00:00:29.305175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-10T00:00:29.305175.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_09T22_06_19.540113", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T22:06:19.540113.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T22:06:19.540113.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T22:06:19.540113.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T22:06:19.540113.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T22:06:19.540113.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T22:06:19.540113.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T22:06:19.540113.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T22:06:19.540113.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T22:06:19.540113.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T22:06:19.540113.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T22:06:19.540113.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T22:06:19.540113.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T22:06:19.540113.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T22:06:19.540113.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T22:06:19.540113.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T22:06:19.540113.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T22:06:19.540113.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T22:06:19.540113.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T22:06:19.540113.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T22:06:19.540113.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T22:06:19.540113.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T22:06:19.540113.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T22:06:19.540113.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T22:06:19.540113.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T22:06:19.540113.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T22:06:19.540113.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T22:06:19.540113.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T22:06:19.540113.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T22:06:19.540113.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T22:06:19.540113.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T22:06:19.540113.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T22:06:19.540113.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T22:06:19.540113.parquet", 
"**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T22:06:19.540113.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T22:06:19.540113.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T22:06:19.540113.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T22:06:19.540113.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T22:06:19.540113.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T22:06:19.540113.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T22:06:19.540113.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T22:06:19.540113.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T22:06:19.540113.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T22:06:19.540113.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T22:06:19.540113.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T22:06:19.540113.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T22:06:19.540113.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T22:06:19.540113.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T22:06:19.540113.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T22:06:19.540113.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T22:06:19.540113.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T22:06:19.540113.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T22:06:19.540113.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T22:06:19.540113.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T22:06:19.540113.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T22:06:19.540113.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T22:06:19.540113.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T22:06:19.540113.parquet"]}, {"split": "2023_08_10T00_00_29.305175", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-10T00:00:29.305175.parquet", 
"**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-10T00:00:29.305175.parquet", 
"**/details_harness|hendrycksTest-sociology|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-10T00:00:29.305175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-10T00:00:29.305175.parquet", 
"**/details_harness|hendrycksTest-international_law|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-10T00:00:29.305175.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-10T00:00:29.305175.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_09T22_06_19.540113", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T22:06:19.540113.parquet"]}, {"split": "2023_08_10T00_00_29.305175", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-10T00:00:29.305175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-10T00:00:29.305175.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_09T22_06_19.540113", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T22:06:19.540113.parquet"]}, {"split": "2023_08_10T00_00_29.305175", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-10T00:00:29.305175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-10T00:00:29.305175.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_09T22_06_19.540113", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T22:06:19.540113.parquet"]}, {"split": "2023_08_10T00_00_29.305175", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-10T00:00:29.305175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-10T00:00:29.305175.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_09T22_06_19.540113", "path": 
["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T22:06:19.540113.parquet"]}, {"split": "2023_08_10T00_00_29.305175", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-10T00:00:29.305175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-10T00:00:29.305175.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_09T22_06_19.540113", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T22:06:19.540113.parquet"]}, {"split": "2023_08_10T00_00_29.305175", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-10T00:00:29.305175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-10T00:00:29.305175.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_09T22_06_19.540113", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T22:06:19.540113.parquet"]}, {"split": "2023_08_10T00_00_29.305175", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-10T00:00:29.305175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-10T00:00:29.305175.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_09T22_06_19.540113", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T22:06:19.540113.parquet"]}, {"split": "2023_08_10T00_00_29.305175", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-10T00:00:29.305175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-10T00:00:29.305175.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_09T22_06_19.540113", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T22:06:19.540113.parquet"]}, {"split": "2023_08_10T00_00_29.305175", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-10T00:00:29.305175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-10T00:00:29.305175.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_09T22_06_19.540113", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T22:06:19.540113.parquet"]}, {"split": "2023_08_10T00_00_29.305175", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-10T00:00:29.305175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-10T00:00:29.305175.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_09T22_06_19.540113", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T22:06:19.540113.parquet"]}, {"split": "2023_08_10T00_00_29.305175", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-10T00:00:29.305175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-10T00:00:29.305175.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_09T22_06_19.540113", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T22:06:19.540113.parquet"]}, {"split": "2023_08_10T00_00_29.305175", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-08-10T00:00:29.305175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-10T00:00:29.305175.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_09T22_06_19.540113", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T22:06:19.540113.parquet"]}, {"split": "2023_08_10T00_00_29.305175", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-10T00:00:29.305175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-10T00:00:29.305175.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_09T22_06_19.540113", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T22:06:19.540113.parquet"]}, {"split": "2023_08_10T00_00_29.305175", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-10T00:00:29.305175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-10T00:00:29.305175.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_09T22_06_19.540113", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T22:06:19.540113.parquet"]}, {"split": "2023_08_10T00_00_29.305175", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-10T00:00:29.305175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-10T00:00:29.305175.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_09T22_06_19.540113", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T22:06:19.540113.parquet"]}, {"split": "2023_08_10T00_00_29.305175", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-10T00:00:29.305175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-10T00:00:29.305175.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_09T22_06_19.540113", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T22:06:19.540113.parquet"]}, {"split": "2023_08_10T00_00_29.305175", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-10T00:00:29.305175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-10T00:00:29.305175.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_09T22_06_19.540113", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T22:06:19.540113.parquet"]}, {"split": "2023_08_10T00_00_29.305175", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-10T00:00:29.305175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-10T00:00:29.305175.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_09T22_06_19.540113", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T22:06:19.540113.parquet"]}, {"split": "2023_08_10T00_00_29.305175", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-10T00:00:29.305175.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-08-10T00:00:29.305175.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_09T22_06_19.540113", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T22:06:19.540113.parquet"]}, {"split": "2023_08_10T00_00_29.305175", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-10T00:00:29.305175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-10T00:00:29.305175.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_09T22_06_19.540113", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T22:06:19.540113.parquet"]}, {"split": "2023_08_10T00_00_29.305175", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-10T00:00:29.305175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-10T00:00:29.305175.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_09T22_06_19.540113", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T22:06:19.540113.parquet"]}, {"split": "2023_08_10T00_00_29.305175", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-10T00:00:29.305175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-10T00:00:29.305175.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_09T22_06_19.540113", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T22:06:19.540113.parquet"]}, {"split": "2023_08_10T00_00_29.305175", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-10T00:00:29.305175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-10T00:00:29.305175.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_09T22_06_19.540113", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T22:06:19.540113.parquet"]}, {"split": "2023_08_10T00_00_29.305175", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-10T00:00:29.305175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-10T00:00:29.305175.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_09T22_06_19.540113", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T22:06:19.540113.parquet"]}, {"split": "2023_08_10T00_00_29.305175", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-10T00:00:29.305175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-10T00:00:29.305175.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_09T22_06_19.540113", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T22:06:19.540113.parquet"]}, {"split": "2023_08_10T00_00_29.305175", "path": 
["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-10T00:00:29.305175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-10T00:00:29.305175.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_09T22_06_19.540113", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T22:06:19.540113.parquet"]}, {"split": "2023_08_10T00_00_29.305175", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-10T00:00:29.305175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-10T00:00:29.305175.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_09T22_06_19.540113", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T22:06:19.540113.parquet"]}, {"split": "2023_08_10T00_00_29.305175", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-10T00:00:29.305175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-10T00:00:29.305175.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_09T22_06_19.540113", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T22:06:19.540113.parquet"]}, {"split": "2023_08_10T00_00_29.305175", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-10T00:00:29.305175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-10T00:00:29.305175.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_09T22_06_19.540113", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T22:06:19.540113.parquet"]}, {"split": "2023_08_10T00_00_29.305175", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-10T00:00:29.305175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-10T00:00:29.305175.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_09T22_06_19.540113", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T22:06:19.540113.parquet"]}, {"split": "2023_08_10T00_00_29.305175", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-10T00:00:29.305175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-10T00:00:29.305175.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_09T22_06_19.540113", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T22:06:19.540113.parquet"]}, {"split": "2023_08_10T00_00_29.305175", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-10T00:00:29.305175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-10T00:00:29.305175.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_09T22_06_19.540113", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T22:06:19.540113.parquet"]}, {"split": "2023_08_10T00_00_29.305175", "path": 
["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-10T00:00:29.305175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-10T00:00:29.305175.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_09T22_06_19.540113", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T22:06:19.540113.parquet"]}, {"split": "2023_08_10T00_00_29.305175", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-10T00:00:29.305175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-10T00:00:29.305175.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_09T22_06_19.540113", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T22:06:19.540113.parquet"]}, {"split": "2023_08_10T00_00_29.305175", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-10T00:00:29.305175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-10T00:00:29.305175.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_09T22_06_19.540113", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T22:06:19.540113.parquet"]}, {"split": "2023_08_10T00_00_29.305175", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-10T00:00:29.305175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-10T00:00:29.305175.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_09T22_06_19.540113", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T22:06:19.540113.parquet"]}, {"split": "2023_08_10T00_00_29.305175", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-10T00:00:29.305175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-10T00:00:29.305175.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_09T22_06_19.540113", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T22:06:19.540113.parquet"]}, {"split": "2023_08_10T00_00_29.305175", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-10T00:00:29.305175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-10T00:00:29.305175.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_09T22_06_19.540113", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T22:06:19.540113.parquet"]}, {"split": "2023_08_10T00_00_29.305175", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-10T00:00:29.305175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-10T00:00:29.305175.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_09T22_06_19.540113", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T22:06:19.540113.parquet"]}, {"split": "2023_08_10T00_00_29.305175", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-10T00:00:29.305175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-10T00:00:29.305175.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_09T22_06_19.540113", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T22:06:19.540113.parquet"]}, {"split": "2023_08_10T00_00_29.305175", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-10T00:00:29.305175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-10T00:00:29.305175.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_09T22_06_19.540113", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T22:06:19.540113.parquet"]}, {"split": "2023_08_10T00_00_29.305175", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-10T00:00:29.305175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-10T00:00:29.305175.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_09T22_06_19.540113", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T22:06:19.540113.parquet"]}, {"split": "2023_08_10T00_00_29.305175", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-10T00:00:29.305175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-10T00:00:29.305175.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_09T22_06_19.540113", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T22:06:19.540113.parquet"]}, {"split": "2023_08_10T00_00_29.305175", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-10T00:00:29.305175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-10T00:00:29.305175.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_09T22_06_19.540113", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T22:06:19.540113.parquet"]}, {"split": "2023_08_10T00_00_29.305175", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-10T00:00:29.305175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-10T00:00:29.305175.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_09T22_06_19.540113", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T22:06:19.540113.parquet"]}, {"split": "2023_08_10T00_00_29.305175", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-10T00:00:29.305175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-10T00:00:29.305175.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_09T22_06_19.540113", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T22:06:19.540113.parquet"]}, {"split": "2023_08_10T00_00_29.305175", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-10T00:00:29.305175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-10T00:00:29.305175.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_09T22_06_19.540113", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T22:06:19.540113.parquet"]}, {"split": "2023_08_10T00_00_29.305175", "path": 
["**/details_harness|hendrycksTest-prehistory|5_2023-08-10T00:00:29.305175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-10T00:00:29.305175.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_09T22_06_19.540113", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T22:06:19.540113.parquet"]}, {"split": "2023_08_10T00_00_29.305175", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-10T00:00:29.305175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-10T00:00:29.305175.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_09T22_06_19.540113", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T22:06:19.540113.parquet"]}, {"split": "2023_08_10T00_00_29.305175", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-10T00:00:29.305175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-10T00:00:29.305175.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_09T22_06_19.540113", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T22:06:19.540113.parquet"]}, {"split": "2023_08_10T00_00_29.305175", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-10T00:00:29.305175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-10T00:00:29.305175.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_09T22_06_19.540113", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T22:06:19.540113.parquet"]}, {"split": "2023_08_10T00_00_29.305175", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-10T00:00:29.305175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-10T00:00:29.305175.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_09T22_06_19.540113", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T22:06:19.540113.parquet"]}, {"split": "2023_08_10T00_00_29.305175", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-10T00:00:29.305175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-10T00:00:29.305175.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_09T22_06_19.540113", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T22:06:19.540113.parquet"]}, {"split": "2023_08_10T00_00_29.305175", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-10T00:00:29.305175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-10T00:00:29.305175.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_09T22_06_19.540113", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T22:06:19.540113.parquet"]}, {"split": "2023_08_10T00_00_29.305175", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-10T00:00:29.305175.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-sociology|5_2023-08-10T00:00:29.305175.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_09T22_06_19.540113", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T22:06:19.540113.parquet"]}, {"split": "2023_08_10T00_00_29.305175", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-10T00:00:29.305175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-10T00:00:29.305175.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_09T22_06_19.540113", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T22:06:19.540113.parquet"]}, {"split": "2023_08_10T00_00_29.305175", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-10T00:00:29.305175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-10T00:00:29.305175.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_09T22_06_19.540113", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T22:06:19.540113.parquet"]}, {"split": "2023_08_10T00_00_29.305175", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-10T00:00:29.305175.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-10T00:00:29.305175.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_09T22_06_19.540113", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T22:06:19.540113.parquet"]}, {"split": "2023_08_10T00_00_29.305175", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-10T00:00:29.305175.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-10T00:00:29.305175.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_23T01_01_59.537351", "path": ["**/details_harness|winogrande|5_2023-10-23T01-01-59.537351.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-23T01-01-59.537351.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_09T22_06_19.540113", "path": ["results_2023-08-09T22:06:19.540113.parquet"]}, {"split": "2023_08_10T00_00_29.305175", "path": ["results_2023-08-10T00:00:29.305175.parquet"]}, {"split": "2023_10_23T01_01_59.537351", "path": ["results_2023-10-23T01-01-59.537351.parquet"]}, {"split": "latest", "path": ["results_2023-10-23T01-01-59.537351.parquet"]}]}]}
|
2023-10-23T00:02:12+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of jondurbin/airoboros-l2-70b-gpt4-m2.0
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model jondurbin/airoboros-l2-70b-gpt4-m2.0 on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
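A minimal loading sketch, assuming the details repository follows the leaderboard's usual `open-llm-leaderboard/details_<org>__<model>` naming pattern (the exact repository name below is inferred from that convention, not stated in this card):

```python
from datasets import load_dataset

# Load the 5-shot Winogrande details for this model; the repository name
# is inferred from the leaderboard naming convention and may need adjusting.
data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-l2-70b-gpt4-m2.0",
	"harness_winogrande_5",
	split="train")
```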
## Latest results
These are the latest results from run 2023-10-23T01:01:59.537351 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of jondurbin/airoboros-l2-70b-gpt4-m2.0",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model jondurbin/airoboros-l2-70b-gpt4-m2.0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-23T01:01:59.537351(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of jondurbin/airoboros-l2-70b-gpt4-m2.0",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model jondurbin/airoboros-l2-70b-gpt4-m2.0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-23T01:01:59.537351(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
27,
31,
175,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of jondurbin/airoboros-l2-70b-gpt4-m2.0## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model jondurbin/airoboros-l2-70b-gpt4-m2.0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-23T01:01:59.537351(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
2a9508dce12cba41c511762f597dcec04d0c0046
|
# Dataset Card for Evaluation run of jondurbin/airoboros-13b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/jondurbin/airoboros-13b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [jondurbin/airoboros-13b](https://huggingface.co/jondurbin/airoboros-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-13b",
"harness_winogrande_5",
split="train")
```
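Besides "latest", each configuration keeps one split per timestamped run, so a specific run can be loaded by passing its split name. A minimal sketch, using a split name taken from the configs listed in this card's metadata:

```python
from datasets import load_dataset

# Load one specific timestamped run instead of the latest results.
# The split name matches the run timestamp as it appears in the dataset configs.
run_data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-13b",
	"harness_winogrande_5",
	split="2023_10_22T14_50_07.034775")
```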
## Latest results
These are the [latest results from run 2023-10-22T23:27:03.840245](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-13b/blob/main/results_2023-10-22T23-27-03.840245.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.11147231543624161,
"em_stderr": 0.0032229876723598116,
"f1": 0.18415897651006652,
"f1_stderr": 0.0034127687312130615,
"acc": 0.41609037484449546,
"acc_stderr": 0.009488844238408485
},
"harness|drop|3": {
"em": 0.11147231543624161,
"em_stderr": 0.0032229876723598116,
"f1": 0.18415897651006652,
"f1_stderr": 0.0034127687312130615
},
"harness|gsm8k|5": {
"acc": 0.06974981046247157,
"acc_stderr": 0.007016389571013826
},
"harness|winogrande|5": {
"acc": 0.7624309392265194,
"acc_stderr": 0.011961298905803145
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_jondurbin__airoboros-13b
|
[
"region:us"
] |
2023-08-18T10:22:21+00:00
|
{"pretty_name": "Evaluation run of jondurbin/airoboros-13b", "dataset_summary": "Dataset automatically created during the evaluation run of model [jondurbin/airoboros-13b](https://huggingface.co/jondurbin/airoboros-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jondurbin__airoboros-13b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-22T23:27:03.840245](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-13b/blob/main/results_2023-10-22T23-27-03.840245.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.11147231543624161,\n \"em_stderr\": 0.0032229876723598116,\n \"f1\": 0.18415897651006652,\n \"f1_stderr\": 0.0034127687312130615,\n \"acc\": 0.41609037484449546,\n \"acc_stderr\": 0.009488844238408485\n },\n \"harness|drop|3\": {\n \"em\": 0.11147231543624161,\n \"em_stderr\": 0.0032229876723598116,\n \"f1\": 0.18415897651006652,\n \"f1_stderr\": 0.0034127687312130615\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.06974981046247157,\n \"acc_stderr\": 0.007016389571013826\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7624309392265194,\n \"acc_stderr\": 0.011961298905803145\n }\n}\n```", "repo_url": "https://huggingface.co/jondurbin/airoboros-13b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_18T16_43_26.994240", "path": ["**/details_harness|arc:challenge|25_2023-07-18T16:43:26.994240.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-18T16:43:26.994240.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_22T14_50_07.034775", "path": ["**/details_harness|drop|3_2023-10-22T14-50-07.034775.parquet"]}, {"split": "2023_10_22T23_27_03.840245", "path": ["**/details_harness|drop|3_2023-10-22T23-27-03.840245.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-22T23-27-03.840245.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_22T14_50_07.034775", "path": ["**/details_harness|gsm8k|5_2023-10-22T14-50-07.034775.parquet"]}, {"split": "2023_10_22T23_27_03.840245", "path": ["**/details_harness|gsm8k|5_2023-10-22T23-27-03.840245.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-22T23-27-03.840245.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_18T16_43_26.994240", "path": 
["**/details_harness|hellaswag|10_2023-07-18T16:43:26.994240.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-18T16:43:26.994240.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_18T16_43_26.994240", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-18T16:43:26.994240.parquet", 
"**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-18T16:43:26.994240.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T16:43:26.994240.parquet", 
"**/details_harness|hendrycksTest-global_facts|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T16:43:26.994240.parquet", 
"**/details_harness|hendrycksTest-virology|5_2023-07-18T16:43:26.994240.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-18T16:43:26.994240.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_18T16_43_26.994240", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T16:43:26.994240.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T16:43:26.994240.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_18T16_43_26.994240", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-18T16:43:26.994240.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-18T16:43:26.994240.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_18T16_43_26.994240", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-18T16:43:26.994240.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-18T16:43:26.994240.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_18T16_43_26.994240", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T16:43:26.994240.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T16:43:26.994240.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_18T16_43_26.994240", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T16:43:26.994240.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T16:43:26.994240.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_18T16_43_26.994240", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-18T16:43:26.994240.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-18T16:43:26.994240.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_18T16_43_26.994240", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T16:43:26.994240.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T16:43:26.994240.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_18T16_43_26.994240", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T16:43:26.994240.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T16:43:26.994240.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_18T16_43_26.994240", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T16:43:26.994240.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T16:43:26.994240.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_18T16_43_26.994240", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T16:43:26.994240.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T16:43:26.994240.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_18T16_43_26.994240", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-18T16:43:26.994240.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-18T16:43:26.994240.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_18T16_43_26.994240", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-18T16:43:26.994240.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-18T16:43:26.994240.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_18T16_43_26.994240", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T16:43:26.994240.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T16:43:26.994240.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_18T16_43_26.994240", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-18T16:43:26.994240.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-18T16:43:26.994240.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_18T16_43_26.994240", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T16:43:26.994240.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T16:43:26.994240.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_18T16_43_26.994240", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T16:43:26.994240.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T16:43:26.994240.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_18T16_43_26.994240", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T16:43:26.994240.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T16:43:26.994240.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_18T16_43_26.994240", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-18T16:43:26.994240.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-18T16:43:26.994240.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_18T16_43_26.994240", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T16:43:26.994240.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T16:43:26.994240.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_18T16_43_26.994240", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T16:43:26.994240.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T16:43:26.994240.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_18T16_43_26.994240", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T16:43:26.994240.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T16:43:26.994240.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_18T16_43_26.994240", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T16:43:26.994240.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T16:43:26.994240.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_18T16_43_26.994240", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T16:43:26.994240.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T16:43:26.994240.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_18T16_43_26.994240", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T16:43:26.994240.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T16:43:26.994240.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_18T16_43_26.994240", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T16:43:26.994240.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T16:43:26.994240.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_18T16_43_26.994240", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T16:43:26.994240.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T16:43:26.994240.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_18T16_43_26.994240", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T16:43:26.994240.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T16:43:26.994240.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_18T16_43_26.994240", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T16:43:26.994240.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T16:43:26.994240.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_18T16_43_26.994240", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T16:43:26.994240.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T16:43:26.994240.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_18T16_43_26.994240", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T16:43:26.994240.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T16:43:26.994240.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_18T16_43_26.994240", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T16:43:26.994240.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T16:43:26.994240.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_18T16_43_26.994240", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T16:43:26.994240.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T16:43:26.994240.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_18T16_43_26.994240", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-18T16:43:26.994240.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-18T16:43:26.994240.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_18T16_43_26.994240", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T16:43:26.994240.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T16:43:26.994240.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_18T16_43_26.994240", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-18T16:43:26.994240.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-18T16:43:26.994240.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_18T16_43_26.994240", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T16:43:26.994240.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T16:43:26.994240.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_18T16_43_26.994240", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T16:43:26.994240.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T16:43:26.994240.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_18T16_43_26.994240", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T16:43:26.994240.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T16:43:26.994240.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_18T16_43_26.994240", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-18T16:43:26.994240.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-18T16:43:26.994240.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_18T16_43_26.994240", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-18T16:43:26.994240.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-18T16:43:26.994240.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_18T16_43_26.994240", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T16:43:26.994240.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T16:43:26.994240.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_18T16_43_26.994240", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T16:43:26.994240.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T16:43:26.994240.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_18T16_43_26.994240", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T16:43:26.994240.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T16:43:26.994240.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_18T16_43_26.994240", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T16:43:26.994240.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T16:43:26.994240.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_18T16_43_26.994240", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-18T16:43:26.994240.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-18T16:43:26.994240.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_18T16_43_26.994240", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-18T16:43:26.994240.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-18T16:43:26.994240.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_18T16_43_26.994240", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-18T16:43:26.994240.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-18T16:43:26.994240.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_18T16_43_26.994240", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T16:43:26.994240.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T16:43:26.994240.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_18T16_43_26.994240", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-18T16:43:26.994240.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-18T16:43:26.994240.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_18T16_43_26.994240", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T16:43:26.994240.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T16:43:26.994240.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_18T16_43_26.994240", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T16:43:26.994240.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T16:43:26.994240.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_18T16_43_26.994240", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-18T16:43:26.994240.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-18T16:43:26.994240.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_18T16_43_26.994240", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-18T16:43:26.994240.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-18T16:43:26.994240.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_18T16_43_26.994240", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-18T16:43:26.994240.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-18T16:43:26.994240.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_18T16_43_26.994240", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T16:43:26.994240.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T16:43:26.994240.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_18T16_43_26.994240", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-18T16:43:26.994240.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-18T16:43:26.994240.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_18T16_43_26.994240", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-18T16:43:26.994240.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-18T16:43:26.994240.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_18T16_43_26.994240", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-18T16:43:26.994240.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-18T16:43:26.994240.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_22T14_50_07.034775", "path": ["**/details_harness|winogrande|5_2023-10-22T14-50-07.034775.parquet"]}, {"split": "2023_10_22T23_27_03.840245", "path": ["**/details_harness|winogrande|5_2023-10-22T23-27-03.840245.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-22T23-27-03.840245.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_18T16_43_26.994240", "path": ["results_2023-07-18T16:43:26.994240.parquet"]}, {"split": "2023_10_22T14_50_07.034775", "path": ["results_2023-10-22T14-50-07.034775.parquet"]}, {"split": "2023_10_22T23_27_03.840245", "path": ["results_2023-10-22T23-27-03.840245.parquet"]}, {"split": "latest", "path": ["results_2023-10-22T23-27-03.840245.parquet"]}]}]}
|
2023-10-22T22:27:12+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of jondurbin/airoboros-13b
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model jondurbin/airoboros-13b on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
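A minimal sketch of the loading call; the repository and config names below are taken from the full card for this model shown earlier in this entry:

```python
from datasets import load_dataset

# Load the 5-shot Winogrande details for jondurbin/airoboros-13b.
data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-13b",
	"harness_winogrande_5",
	split="train")
```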
## Latest results
These are the latest results from run 2023-10-22T23:27:03.840245 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
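For reference, these are the aggregated metrics for that run, reproduced from the full card earlier in this entry:

```python
{
    "all": {
        "em": 0.11147231543624161,
        "em_stderr": 0.0032229876723598116,
        "f1": 0.18415897651006652,
        "f1_stderr": 0.0034127687312130615,
        "acc": 0.41609037484449546,
        "acc_stderr": 0.009488844238408485
    },
    "harness|drop|3": {
        "em": 0.11147231543624161,
        "em_stderr": 0.0032229876723598116,
        "f1": 0.18415897651006652,
        "f1_stderr": 0.0034127687312130615
    },
    "harness|gsm8k|5": {
        "acc": 0.06974981046247157,
        "acc_stderr": 0.007016389571013826
    },
    "harness|winogrande|5": {
        "acc": 0.7624309392265194,
        "acc_stderr": 0.011961298905803145
    }
}
```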
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of jondurbin/airoboros-13b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model jondurbin/airoboros-13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-22T23:27:03.840245(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of jondurbin/airoboros-13b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model jondurbin/airoboros-13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-22T23:27:03.840245(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
18,
31,
166,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of jondurbin/airoboros-13b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model jondurbin/airoboros-13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-22T23:27:03.840245(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
12e17cdd3aa1ef39ea274ff94fd55d8d1c9fbea7
|
# Dataset Card for Evaluation run of jondurbin/airoboros-33b-gpt4-1.2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/jondurbin/airoboros-33b-gpt4-1.2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [jondurbin/airoboros-33b-gpt4-1.2](https://huggingface.co/jondurbin/airoboros-33b-gpt4-1.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-33b-gpt4-1.2",
"harness_winogrande_5",
split="train")
```
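The card's configuration metadata also exposes an aggregated "results" configuration with a "latest" split; a minimal sketch of loading it, assuming the same repository id as above:

```python
from datasets import load_dataset

# "results" is the aggregated configuration listed in this card's metadata;
# its "latest" split points to the most recent evaluation run.
results = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-33b-gpt4-1.2",
                       "results",
                       split="latest")
```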
## Latest results
These are the [latest results from run 2023-10-22T09:57:02.769369](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-33b-gpt4-1.2/blob/main/results_2023-10-22T09-57-02.769369.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.12531459731543623,
"em_stderr": 0.003390520871377358,
"f1": 0.1920878775167779,
"f1_stderr": 0.0034989908215168806,
"acc": 0.4364302798094512,
"acc_stderr": 0.0099585816929879
},
"harness|drop|3": {
"em": 0.12531459731543623,
"em_stderr": 0.003390520871377358,
"f1": 0.1920878775167779,
"f1_stderr": 0.0034989908215168806
},
"harness|gsm8k|5": {
"acc": 0.0978013646702047,
"acc_stderr": 0.008182119821849056
},
"harness|winogrande|5": {
"acc": 0.7750591949486977,
"acc_stderr": 0.011735043564126746
}
}
```
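As a sanity check, the aggregated "acc" above appears to be the unweighted mean of the two sub-task accuracies; a quick verification using the values copied verbatim from the block:

```python
# Values copied verbatim from the results block above.
gsm8k_acc = 0.0978013646702047
winogrande_acc = 0.7750591949486977

# The "all" accuracy matches the plain mean of the two tasks.
print((gsm8k_acc + winogrande_acc) / 2)  # 0.4364302798094512
```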
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_jondurbin__airoboros-33b-gpt4-1.2
|
[
"region:us"
] |
2023-08-18T10:22:30+00:00
|
{"pretty_name": "Evaluation run of jondurbin/airoboros-33b-gpt4-1.2", "dataset_summary": "Dataset automatically created during the evaluation run of model [jondurbin/airoboros-33b-gpt4-1.2](https://huggingface.co/jondurbin/airoboros-33b-gpt4-1.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jondurbin__airoboros-33b-gpt4-1.2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-22T09:57:02.769369](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-33b-gpt4-1.2/blob/main/results_2023-10-22T09-57-02.769369.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.12531459731543623,\n \"em_stderr\": 0.003390520871377358,\n \"f1\": 0.1920878775167779,\n \"f1_stderr\": 0.0034989908215168806,\n \"acc\": 0.4364302798094512,\n \"acc_stderr\": 0.0099585816929879\n },\n \"harness|drop|3\": {\n \"em\": 0.12531459731543623,\n \"em_stderr\": 0.003390520871377358,\n \"f1\": 0.1920878775167779,\n \"f1_stderr\": 0.0034989908215168806\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0978013646702047,\n \"acc_stderr\": 0.008182119821849056\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7750591949486977,\n \"acc_stderr\": 0.011735043564126746\n }\n}\n```", "repo_url": "https://huggingface.co/jondurbin/airoboros-33b-gpt4-1.2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_31T12_34_22.345109", "path": ["**/details_harness|arc:challenge|25_2023-07-31T12:34:22.345109.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-31T12:34:22.345109.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_22T07_36_51.791701", "path": ["**/details_harness|drop|3_2023-10-22T07-36-51.791701.parquet"]}, {"split": "2023_10_22T09_57_02.769369", "path": ["**/details_harness|drop|3_2023-10-22T09-57-02.769369.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-22T09-57-02.769369.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_22T07_36_51.791701", "path": ["**/details_harness|gsm8k|5_2023-10-22T07-36-51.791701.parquet"]}, {"split": "2023_10_22T09_57_02.769369", "path": ["**/details_harness|gsm8k|5_2023-10-22T09-57-02.769369.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-22T09-57-02.769369.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": 
"2023_07_31T12_34_22.345109", "path": ["**/details_harness|hellaswag|10_2023-07-31T12:34:22.345109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-31T12:34:22.345109.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_31T12_34_22.345109", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T12:34:22.345109.parquet", 
"**/details_harness|hendrycksTest-international_law|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-31T12:34:22.345109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T12:34:22.345109.parquet", 
"**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-31T12:34:22.345109.parquet", 
"**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-31T12:34:22.345109.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-31T12:34:22.345109.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_31T12_34_22.345109", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T12:34:22.345109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T12:34:22.345109.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_31T12_34_22.345109", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-31T12:34:22.345109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-31T12:34:22.345109.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_31T12_34_22.345109", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-31T12:34:22.345109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-31T12:34:22.345109.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_31T12_34_22.345109", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T12:34:22.345109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T12:34:22.345109.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_31T12_34_22.345109", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T12:34:22.345109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T12:34:22.345109.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_31T12_34_22.345109", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-31T12:34:22.345109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-31T12:34:22.345109.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_31T12_34_22.345109", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T12:34:22.345109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T12:34:22.345109.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_31T12_34_22.345109", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T12:34:22.345109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T12:34:22.345109.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_31T12_34_22.345109", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T12:34:22.345109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T12:34:22.345109.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_31T12_34_22.345109", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T12:34:22.345109.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T12:34:22.345109.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_31T12_34_22.345109", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-31T12:34:22.345109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-31T12:34:22.345109.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_31T12_34_22.345109", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-31T12:34:22.345109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-31T12:34:22.345109.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_31T12_34_22.345109", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T12:34:22.345109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T12:34:22.345109.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_31T12_34_22.345109", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-31T12:34:22.345109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-31T12:34:22.345109.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_31T12_34_22.345109", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T12:34:22.345109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T12:34:22.345109.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_31T12_34_22.345109", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T12:34:22.345109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T12:34:22.345109.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_31T12_34_22.345109", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T12:34:22.345109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T12:34:22.345109.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_31T12_34_22.345109", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-31T12:34:22.345109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-31T12:34:22.345109.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_31T12_34_22.345109", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T12:34:22.345109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T12:34:22.345109.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_31T12_34_22.345109", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T12:34:22.345109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T12:34:22.345109.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_31T12_34_22.345109", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T12:34:22.345109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T12:34:22.345109.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_31T12_34_22.345109", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T12:34:22.345109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T12:34:22.345109.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_31T12_34_22.345109", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T12:34:22.345109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T12:34:22.345109.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_31T12_34_22.345109", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T12:34:22.345109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T12:34:22.345109.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_31T12_34_22.345109", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T12:34:22.345109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T12:34:22.345109.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_31T12_34_22.345109", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T12:34:22.345109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T12:34:22.345109.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_31T12_34_22.345109", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T12:34:22.345109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T12:34:22.345109.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_31T12_34_22.345109", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T12:34:22.345109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T12:34:22.345109.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_31T12_34_22.345109", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T12:34:22.345109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T12:34:22.345109.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_31T12_34_22.345109", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T12:34:22.345109.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T12:34:22.345109.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_31T12_34_22.345109", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T12:34:22.345109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T12:34:22.345109.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_31T12_34_22.345109", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T12:34:22.345109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T12:34:22.345109.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_31T12_34_22.345109", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-31T12:34:22.345109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-31T12:34:22.345109.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_31T12_34_22.345109", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T12:34:22.345109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T12:34:22.345109.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_31T12_34_22.345109", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-31T12:34:22.345109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-31T12:34:22.345109.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_31T12_34_22.345109", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T12:34:22.345109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T12:34:22.345109.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_31T12_34_22.345109", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T12:34:22.345109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T12:34:22.345109.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_31T12_34_22.345109", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T12:34:22.345109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T12:34:22.345109.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_31T12_34_22.345109", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-31T12:34:22.345109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-31T12:34:22.345109.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_31T12_34_22.345109", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-31T12:34:22.345109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-31T12:34:22.345109.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": 
[{"split": "2023_07_31T12_34_22.345109", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T12:34:22.345109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T12:34:22.345109.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_31T12_34_22.345109", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T12:34:22.345109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T12:34:22.345109.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_31T12_34_22.345109", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T12:34:22.345109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T12:34:22.345109.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_31T12_34_22.345109", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T12:34:22.345109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T12:34:22.345109.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_31T12_34_22.345109", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-31T12:34:22.345109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-31T12:34:22.345109.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_31T12_34_22.345109", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-31T12:34:22.345109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-31T12:34:22.345109.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_31T12_34_22.345109", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-31T12:34:22.345109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-31T12:34:22.345109.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_31T12_34_22.345109", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T12:34:22.345109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T12:34:22.345109.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_31T12_34_22.345109", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-31T12:34:22.345109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-31T12:34:22.345109.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_31T12_34_22.345109", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T12:34:22.345109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T12:34:22.345109.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_31T12_34_22.345109", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T12:34:22.345109.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T12:34:22.345109.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_31T12_34_22.345109", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-31T12:34:22.345109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-31T12:34:22.345109.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_31T12_34_22.345109", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-31T12:34:22.345109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-31T12:34:22.345109.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_31T12_34_22.345109", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-31T12:34:22.345109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-31T12:34:22.345109.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_31T12_34_22.345109", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T12:34:22.345109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T12:34:22.345109.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_31T12_34_22.345109", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-31T12:34:22.345109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-31T12:34:22.345109.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_31T12_34_22.345109", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-31T12:34:22.345109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-31T12:34:22.345109.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_31T12_34_22.345109", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-31T12:34:22.345109.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-31T12:34:22.345109.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_22T07_36_51.791701", "path": ["**/details_harness|winogrande|5_2023-10-22T07-36-51.791701.parquet"]}, {"split": "2023_10_22T09_57_02.769369", "path": ["**/details_harness|winogrande|5_2023-10-22T09-57-02.769369.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-22T09-57-02.769369.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_31T12_34_22.345109", "path": ["results_2023-07-31T12:34:22.345109.parquet"]}, {"split": "2023_10_22T07_36_51.791701", "path": ["results_2023-10-22T07-36-51.791701.parquet"]}, {"split": "2023_10_22T09_57_02.769369", "path": ["results_2023-10-22T09-57-02.769369.parquet"]}, {"split": "latest", "path": ["results_2023-10-22T09-57-02.769369.parquet"]}]}]}
|
2023-10-22T08:57:11+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of jondurbin/airoboros-33b-gpt4-1.2
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model jondurbin/airoboros-33b-gpt4-1.2 on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-10-22T09:57:02.769369 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of jondurbin/airoboros-33b-gpt4-1.2",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model jondurbin/airoboros-33b-gpt4-1.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-22T09:57:02.769369(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of jondurbin/airoboros-33b-gpt4-1.2",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model jondurbin/airoboros-33b-gpt4-1.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-22T09:57:02.769369(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
23,
31,
171,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of jondurbin/airoboros-33b-gpt4-1.2## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model jondurbin/airoboros-33b-gpt4-1.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-22T09:57:02.769369(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
78d9e6b879f10422dbe23cde2b8268fcdb39cadc
|
# Dataset Card for Evaluation run of jondurbin/airoboros-33b-gpt4-m2.0
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/jondurbin/airoboros-33b-gpt4-m2.0
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [jondurbin/airoboros-33b-gpt4-m2.0](https://huggingface.co/jondurbin/airoboros-33b-gpt4-m2.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-33b-gpt4-m2.0",
"harness_winogrande_5",
split="train")
```
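Each run is also addressable by its timestamped split rather than "train"; a sketch using a split name taken verbatim from this card's configuration metadata (the exact names depend on which runs exist in the repository):

```python
from datasets import load_dataset

# "2023_10_22T05_59_09.159543" is one timestamped run split listed in this
# card's metadata for the harness_gsm8k_5 configuration.
run = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-33b-gpt4-m2.0",
                   "harness_gsm8k_5",
                   split="2023_10_22T05_59_09.159543")
```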
## Latest results
These are the [latest results from run 2023-10-22T05:59:09.159543](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-33b-gpt4-m2.0/blob/main/results_2023-10-22T05-59-09.159543.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.30432046979865773,
"em_stderr": 0.004712049527083924,
"f1": 0.37717596476510223,
"f1_stderr": 0.00456045095000614,
"acc": 0.4400130926002275,
"acc_stderr": 0.009847939494812614
},
"harness|drop|3": {
"em": 0.30432046979865773,
"em_stderr": 0.004712049527083924,
"f1": 0.37717596476510223,
"f1_stderr": 0.00456045095000614
},
"harness|gsm8k|5": {
"acc": 0.09628506444275967,
"acc_stderr": 0.008125264128215886
},
"harness|winogrande|5": {
"acc": 0.7837411207576953,
"acc_stderr": 0.011570614861409345
}
}
```
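To enumerate the available configurations without downloading any data, the `datasets` library's config inspection helper can be used; a minimal sketch (the actual names and count come from the repository itself):

```python
from datasets import get_dataset_config_names

# Lists the per-task configurations plus the aggregated "results" config.
configs = get_dataset_config_names("open-llm-leaderboard/details_jondurbin__airoboros-33b-gpt4-m2.0")
print(len(configs), configs[:5])
```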
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_jondurbin__airoboros-33b-gpt4-m2.0
|
[
"region:us"
] |
2023-08-18T10:22:38+00:00
|
{"pretty_name": "Evaluation run of jondurbin/airoboros-33b-gpt4-m2.0", "dataset_summary": "Dataset automatically created during the evaluation run of model [jondurbin/airoboros-33b-gpt4-m2.0](https://huggingface.co/jondurbin/airoboros-33b-gpt4-m2.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jondurbin__airoboros-33b-gpt4-m2.0\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-22T05:59:09.159543](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-33b-gpt4-m2.0/blob/main/results_2023-10-22T05-59-09.159543.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.30432046979865773,\n \"em_stderr\": 0.004712049527083924,\n \"f1\": 0.37717596476510223,\n \"f1_stderr\": 0.00456045095000614,\n \"acc\": 0.4400130926002275,\n \"acc_stderr\": 0.009847939494812614\n },\n \"harness|drop|3\": {\n \"em\": 0.30432046979865773,\n \"em_stderr\": 0.004712049527083924,\n \"f1\": 0.37717596476510223,\n \"f1_stderr\": 0.00456045095000614\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.09628506444275967,\n \"acc_stderr\": 0.008125264128215886\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7837411207576953,\n \"acc_stderr\": 0.011570614861409345\n }\n}\n```", "repo_url": "https://huggingface.co/jondurbin/airoboros-33b-gpt4-m2.0", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_02T16_13_19.014173", "path": ["**/details_harness|arc:challenge|25_2023-08-02T16:13:19.014173.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-02T16:13:19.014173.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_19T10_51_22.664215", "path": ["**/details_harness|drop|3_2023-10-19T10-51-22.664215.parquet"]}, {"split": "2023_10_21T18_09_50.123692", "path": ["**/details_harness|drop|3_2023-10-21T18-09-50.123692.parquet"]}, {"split": "2023_10_22T05_59_09.159543", "path": ["**/details_harness|drop|3_2023-10-22T05-59-09.159543.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-22T05-59-09.159543.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_19T10_51_22.664215", "path": ["**/details_harness|gsm8k|5_2023-10-19T10-51-22.664215.parquet"]}, {"split": "2023_10_21T18_09_50.123692", "path": ["**/details_harness|gsm8k|5_2023-10-21T18-09-50.123692.parquet"]}, {"split": "2023_10_22T05_59_09.159543", "path": 
["**/details_harness|gsm8k|5_2023-10-22T05-59-09.159543.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-22T05-59-09.159543.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_02T16_13_19.014173", "path": ["**/details_harness|hellaswag|10_2023-08-02T16:13:19.014173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-02T16:13:19.014173.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_02T16_13_19.014173", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-02T16:13:19.014173.parquet", 
"**/details_harness|hendrycksTest-human_aging|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-02T16:13:19.014173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-02T16:13:19.014173.parquet", 
"**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-02T16:13:19.014173.parquet", 
"**/details_harness|hendrycksTest-security_studies|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-02T16:13:19.014173.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-02T16:13:19.014173.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_02T16_13_19.014173", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-02T16:13:19.014173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-02T16:13:19.014173.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_02T16_13_19.014173", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-02T16:13:19.014173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-02T16:13:19.014173.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_02T16_13_19.014173", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-02T16:13:19.014173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-02T16:13:19.014173.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_02T16_13_19.014173", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-02T16:13:19.014173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-02T16:13:19.014173.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_02T16_13_19.014173", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-02T16:13:19.014173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-02T16:13:19.014173.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_02T16_13_19.014173", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-02T16:13:19.014173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-02T16:13:19.014173.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_02T16_13_19.014173", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-02T16:13:19.014173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-02T16:13:19.014173.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_02T16_13_19.014173", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-02T16:13:19.014173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-02T16:13:19.014173.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_02T16_13_19.014173", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-02T16:13:19.014173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-02T16:13:19.014173.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": 
[{"split": "2023_08_02T16_13_19.014173", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-02T16:13:19.014173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-02T16:13:19.014173.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_02T16_13_19.014173", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-02T16:13:19.014173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-02T16:13:19.014173.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_02T16_13_19.014173", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-02T16:13:19.014173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-02T16:13:19.014173.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_02T16_13_19.014173", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-02T16:13:19.014173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-02T16:13:19.014173.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_02T16_13_19.014173", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-02T16:13:19.014173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-02T16:13:19.014173.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_02T16_13_19.014173", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-02T16:13:19.014173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-02T16:13:19.014173.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_02T16_13_19.014173", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-02T16:13:19.014173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-02T16:13:19.014173.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_02T16_13_19.014173", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-02T16:13:19.014173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-02T16:13:19.014173.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_02T16_13_19.014173", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-02T16:13:19.014173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-02T16:13:19.014173.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_02T16_13_19.014173", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-02T16:13:19.014173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-02T16:13:19.014173.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_02T16_13_19.014173", "path": 
["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-02T16:13:19.014173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-02T16:13:19.014173.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_02T16_13_19.014173", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-02T16:13:19.014173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-02T16:13:19.014173.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_02T16_13_19.014173", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-02T16:13:19.014173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-02T16:13:19.014173.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_02T16_13_19.014173", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-02T16:13:19.014173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-02T16:13:19.014173.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_02T16_13_19.014173", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-02T16:13:19.014173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-02T16:13:19.014173.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_02T16_13_19.014173", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-02T16:13:19.014173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-02T16:13:19.014173.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_02T16_13_19.014173", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-02T16:13:19.014173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-02T16:13:19.014173.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_02T16_13_19.014173", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-02T16:13:19.014173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-02T16:13:19.014173.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_02T16_13_19.014173", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-02T16:13:19.014173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-02T16:13:19.014173.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_02T16_13_19.014173", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-02T16:13:19.014173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-02T16:13:19.014173.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_02T16_13_19.014173", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-02T16:13:19.014173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-02T16:13:19.014173.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_02T16_13_19.014173", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-02T16:13:19.014173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-02T16:13:19.014173.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_02T16_13_19.014173", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-02T16:13:19.014173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-02T16:13:19.014173.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_02T16_13_19.014173", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-02T16:13:19.014173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-02T16:13:19.014173.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_02T16_13_19.014173", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-02T16:13:19.014173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-02T16:13:19.014173.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_02T16_13_19.014173", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-02T16:13:19.014173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-02T16:13:19.014173.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_02T16_13_19.014173", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-02T16:13:19.014173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-02T16:13:19.014173.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_02T16_13_19.014173", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-02T16:13:19.014173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-02T16:13:19.014173.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_02T16_13_19.014173", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-02T16:13:19.014173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-02T16:13:19.014173.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_02T16_13_19.014173", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-02T16:13:19.014173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-02T16:13:19.014173.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_02T16_13_19.014173", "path": 
["**/details_harness|hendrycksTest-marketing|5_2023-08-02T16:13:19.014173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-02T16:13:19.014173.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_02T16_13_19.014173", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-02T16:13:19.014173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-02T16:13:19.014173.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_02T16_13_19.014173", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-02T16:13:19.014173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-02T16:13:19.014173.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_02T16_13_19.014173", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-02T16:13:19.014173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-02T16:13:19.014173.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_02T16_13_19.014173", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-02T16:13:19.014173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-02T16:13:19.014173.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_02T16_13_19.014173", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-02T16:13:19.014173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-02T16:13:19.014173.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_02T16_13_19.014173", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-02T16:13:19.014173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-02T16:13:19.014173.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_02T16_13_19.014173", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-02T16:13:19.014173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-02T16:13:19.014173.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_02T16_13_19.014173", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-02T16:13:19.014173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-02T16:13:19.014173.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_02T16_13_19.014173", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-02T16:13:19.014173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-02T16:13:19.014173.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_02T16_13_19.014173", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-02T16:13:19.014173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-02T16:13:19.014173.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_02T16_13_19.014173", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-02T16:13:19.014173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-02T16:13:19.014173.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_02T16_13_19.014173", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-02T16:13:19.014173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-02T16:13:19.014173.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_02T16_13_19.014173", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-02T16:13:19.014173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-02T16:13:19.014173.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_02T16_13_19.014173", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-02T16:13:19.014173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-02T16:13:19.014173.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_02T16_13_19.014173", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-02T16:13:19.014173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-02T16:13:19.014173.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_02T16_13_19.014173", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-02T16:13:19.014173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-02T16:13:19.014173.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_02T16_13_19.014173", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-02T16:13:19.014173.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-02T16:13:19.014173.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_02T16_13_19.014173", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-02T16:13:19.014173.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-02T16:13:19.014173.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_19T10_51_22.664215", "path": ["**/details_harness|winogrande|5_2023-10-19T10-51-22.664215.parquet"]}, {"split": "2023_10_21T18_09_50.123692", "path": ["**/details_harness|winogrande|5_2023-10-21T18-09-50.123692.parquet"]}, {"split": "2023_10_22T05_59_09.159543", "path": ["**/details_harness|winogrande|5_2023-10-22T05-59-09.159543.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-22T05-59-09.159543.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_02T16_13_19.014173", "path": ["results_2023-08-02T16:13:19.014173.parquet"]}, {"split": "2023_10_19T10_51_22.664215", "path": ["results_2023-10-19T10-51-22.664215.parquet"]}, {"split": "2023_10_21T18_09_50.123692", "path": ["results_2023-10-21T18-09-50.123692.parquet"]}, {"split": "2023_10_22T05_59_09.159543", "path": 
["results_2023-10-22T05-59-09.159543.parquet"]}, {"split": "latest", "path": ["results_2023-10-22T05-59-09.159543.parquet"]}]}]}
|
2023-10-22T04:59:22+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of jondurbin/airoboros-33b-gpt4-m2.0
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/jondurbin/airoboros-33b-gpt4-m2.0
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [jondurbin/airoboros-33b-gpt4-m2.0](https://huggingface.co/jondurbin/airoboros-33b-gpt4-m2.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
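```python
from datasets import load_dataset

# Latest results for a given task configuration of this model
data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-33b-gpt4-m2.0",
	"harness_winogrande_5",
	split="train")

# A specific run can also be selected by using its timestamp as the split name,
# e.g. split="2023_10_22T05_59_09.159543" for the "harness_drop_3" configuration.
```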
## Latest results
These are the [latest results from run 2023-10-22T05:59:09.159543](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-33b-gpt4-m2.0/blob/main/results_2023-10-22T05-59-09.159543.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
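```python
{
    "all": {
        "em": 0.30432046979865773,
        "em_stderr": 0.004712049527083924,
        "f1": 0.37717596476510223,
        "f1_stderr": 0.00456045095000614,
        "acc": 0.4400130926002275,
        "acc_stderr": 0.009847939494812614
    },
    "harness|drop|3": {
        "em": 0.30432046979865773,
        "em_stderr": 0.004712049527083924,
        "f1": 0.37717596476510223,
        "f1_stderr": 0.00456045095000614
    },
    "harness|gsm8k|5": {
        "acc": 0.09628506444275967,
        "acc_stderr": 0.008125264128215886
    },
    "harness|winogrande|5": {
        "acc": 0.7837411207576953,
        "acc_stderr": 0.011570614861409345
    }
}
```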
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of jondurbin/airoboros-33b-gpt4-m2.0",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model jondurbin/airoboros-33b-gpt4-m2.0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-22T05:59:09.159543(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of jondurbin/airoboros-33b-gpt4-m2.0",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model jondurbin/airoboros-33b-gpt4-m2.0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-22T05:59:09.159543(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
24,
31,
172,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of jondurbin/airoboros-33b-gpt4-m2.0## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model jondurbin/airoboros-33b-gpt4-m2.0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-22T05:59:09.159543(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
18fdaa643e9056046830be72d988853bee6796e6
|
# Dataset Card for Evaluation run of jondurbin/airoboros-7b-gpt4-1.2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/jondurbin/airoboros-7b-gpt4-1.2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [jondurbin/airoboros-7b-gpt4-1.2](https://huggingface.co/jondurbin/airoboros-7b-gpt4-1.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-7b-gpt4-1.2",
"harness_winogrande_5",
split="train")
```
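Each run can also be loaded directly by using its timestamp as the split name; a minimal sketch, with the configuration and split names taken from this card's configuration list:

```python
from datasets import load_dataset

# Details of the 2023-10-21 DROP run for this model
data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-7b-gpt4-1.2",
	"harness_drop_3",
	split="2023_10_21T15_46_08.253072")
```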
## Latest results
These are the [latest results from run 2023-10-21T15:46:08.253072](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-7b-gpt4-1.2/blob/main/results_2023-10-21T15-46-08.253072.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.1579278523489933,
"em_stderr": 0.00373459634198771,
"f1": 0.21763213087248284,
"f1_stderr": 0.003838141702918339,
"acc": 0.3689408577089266,
"acc_stderr": 0.008317600432676979
},
"harness|drop|3": {
"em": 0.1579278523489933,
"em_stderr": 0.00373459634198771,
"f1": 0.21763213087248284,
"f1_stderr": 0.003838141702918339
},
"harness|gsm8k|5": {
"acc": 0.02122820318423048,
"acc_stderr": 0.003970449129848635
},
"harness|winogrande|5": {
"acc": 0.7166535122336227,
"acc_stderr": 0.012664751735505323
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_jondurbin__airoboros-7b-gpt4-1.2
|
[
"region:us"
] |
2023-08-18T10:22:46+00:00
|
{"pretty_name": "Evaluation run of jondurbin/airoboros-7b-gpt4-1.2", "dataset_summary": "Dataset automatically created during the evaluation run of model [jondurbin/airoboros-7b-gpt4-1.2](https://huggingface.co/jondurbin/airoboros-7b-gpt4-1.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jondurbin__airoboros-7b-gpt4-1.2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-21T15:46:08.253072](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-7b-gpt4-1.2/blob/main/results_2023-10-21T15-46-08.253072.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.1579278523489933,\n \"em_stderr\": 0.00373459634198771,\n \"f1\": 0.21763213087248284,\n \"f1_stderr\": 0.003838141702918339,\n \"acc\": 0.3689408577089266,\n \"acc_stderr\": 0.008317600432676979\n },\n \"harness|drop|3\": {\n \"em\": 0.1579278523489933,\n \"em_stderr\": 0.00373459634198771,\n \"f1\": 0.21763213087248284,\n \"f1_stderr\": 0.003838141702918339\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.02122820318423048,\n \"acc_stderr\": 0.003970449129848635\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7166535122336227,\n \"acc_stderr\": 0.012664751735505323\n }\n}\n```", "repo_url": "https://huggingface.co/jondurbin/airoboros-7b-gpt4-1.2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_31T14_04_39.266883", "path": ["**/details_harness|arc:challenge|25_2023-07-31T14:04:39.266883.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-31T14:04:39.266883.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_21T15_46_08.253072", "path": ["**/details_harness|drop|3_2023-10-21T15-46-08.253072.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-21T15-46-08.253072.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_21T15_46_08.253072", "path": ["**/details_harness|gsm8k|5_2023-10-21T15-46-08.253072.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-21T15-46-08.253072.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_31T14_04_39.266883", "path": ["**/details_harness|hellaswag|10_2023-07-31T14:04:39.266883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-31T14:04:39.266883.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_5", "data_files": [{"split": "2023_07_31T14_04_39.266883", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T14:04:39.266883.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-31T14:04:39.266883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T14:04:39.266883.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-31T14:04:39.266883.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-31T14:04:39.266883.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_31T14_04_39.266883", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T14:04:39.266883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T14:04:39.266883.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_31T14_04_39.266883", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-31T14:04:39.266883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-31T14:04:39.266883.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_31T14_04_39.266883", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-31T14:04:39.266883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-31T14:04:39.266883.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_31T14_04_39.266883", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T14:04:39.266883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T14:04:39.266883.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_31T14_04_39.266883", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T14:04:39.266883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T14:04:39.266883.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_31T14_04_39.266883", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-31T14:04:39.266883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-31T14:04:39.266883.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_31T14_04_39.266883", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T14:04:39.266883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T14:04:39.266883.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_31T14_04_39.266883", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T14:04:39.266883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T14:04:39.266883.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_31T14_04_39.266883", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T14:04:39.266883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T14:04:39.266883.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_31T14_04_39.266883", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T14:04:39.266883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T14:04:39.266883.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_31T14_04_39.266883", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-31T14:04:39.266883.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-31T14:04:39.266883.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_31T14_04_39.266883", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-31T14:04:39.266883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-31T14:04:39.266883.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_31T14_04_39.266883", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T14:04:39.266883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T14:04:39.266883.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_31T14_04_39.266883", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-31T14:04:39.266883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-31T14:04:39.266883.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_31T14_04_39.266883", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T14:04:39.266883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T14:04:39.266883.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_31T14_04_39.266883", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T14:04:39.266883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T14:04:39.266883.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_31T14_04_39.266883", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T14:04:39.266883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T14:04:39.266883.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_31T14_04_39.266883", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-31T14:04:39.266883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-31T14:04:39.266883.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_31T14_04_39.266883", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T14:04:39.266883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T14:04:39.266883.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_31T14_04_39.266883", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T14:04:39.266883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T14:04:39.266883.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_31T14_04_39.266883", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T14:04:39.266883.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T14:04:39.266883.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_31T14_04_39.266883", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T14:04:39.266883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T14:04:39.266883.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_31T14_04_39.266883", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T14:04:39.266883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T14:04:39.266883.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_31T14_04_39.266883", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T14:04:39.266883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T14:04:39.266883.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_31T14_04_39.266883", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T14:04:39.266883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T14:04:39.266883.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_31T14_04_39.266883", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T14:04:39.266883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T14:04:39.266883.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_31T14_04_39.266883", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T14:04:39.266883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T14:04:39.266883.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_31T14_04_39.266883", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T14:04:39.266883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T14:04:39.266883.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_31T14_04_39.266883", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T14:04:39.266883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T14:04:39.266883.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_31T14_04_39.266883", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T14:04:39.266883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T14:04:39.266883.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_31T14_04_39.266883", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T14:04:39.266883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T14:04:39.266883.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_31T14_04_39.266883", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T14:04:39.266883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T14:04:39.266883.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_31T14_04_39.266883", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-31T14:04:39.266883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-31T14:04:39.266883.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_31T14_04_39.266883", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T14:04:39.266883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T14:04:39.266883.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_31T14_04_39.266883", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-31T14:04:39.266883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-31T14:04:39.266883.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_31T14_04_39.266883", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T14:04:39.266883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T14:04:39.266883.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_31T14_04_39.266883", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T14:04:39.266883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T14:04:39.266883.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_31T14_04_39.266883", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T14:04:39.266883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T14:04:39.266883.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_31T14_04_39.266883", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-31T14:04:39.266883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-31T14:04:39.266883.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_31T14_04_39.266883", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-31T14:04:39.266883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-31T14:04:39.266883.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_31T14_04_39.266883", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T14:04:39.266883.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T14:04:39.266883.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_31T14_04_39.266883", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T14:04:39.266883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T14:04:39.266883.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_31T14_04_39.266883", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T14:04:39.266883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T14:04:39.266883.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_31T14_04_39.266883", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T14:04:39.266883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T14:04:39.266883.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_31T14_04_39.266883", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-31T14:04:39.266883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-31T14:04:39.266883.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_31T14_04_39.266883", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-31T14:04:39.266883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-31T14:04:39.266883.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_31T14_04_39.266883", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-31T14:04:39.266883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-31T14:04:39.266883.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_31T14_04_39.266883", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T14:04:39.266883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T14:04:39.266883.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_31T14_04_39.266883", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-31T14:04:39.266883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-31T14:04:39.266883.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_31T14_04_39.266883", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T14:04:39.266883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T14:04:39.266883.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_31T14_04_39.266883", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T14:04:39.266883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T14:04:39.266883.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_31T14_04_39.266883", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-31T14:04:39.266883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-31T14:04:39.266883.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_31T14_04_39.266883", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-31T14:04:39.266883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-31T14:04:39.266883.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_31T14_04_39.266883", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-31T14:04:39.266883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-31T14:04:39.266883.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_31T14_04_39.266883", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T14:04:39.266883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T14:04:39.266883.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_31T14_04_39.266883", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-31T14:04:39.266883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-31T14:04:39.266883.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_31T14_04_39.266883", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-31T14:04:39.266883.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-31T14:04:39.266883.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_31T14_04_39.266883", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-31T14:04:39.266883.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-31T14:04:39.266883.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_21T15_46_08.253072", "path": ["**/details_harness|winogrande|5_2023-10-21T15-46-08.253072.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-21T15-46-08.253072.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_31T14_04_39.266883", "path": ["results_2023-07-31T14:04:39.266883.parquet"]}, {"split": "2023_10_21T15_46_08.253072", "path": ["results_2023-10-21T15-46-08.253072.parquet"]}, {"split": "latest", "path": ["results_2023-10-21T15-46-08.253072.parquet"]}]}]}
|
2023-10-21T14:46:21+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of jondurbin/airoboros-7b-gpt4-1.2
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model jondurbin/airoboros-7b-gpt4-1.2 on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
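The loading snippet itself was stripped from this card during sanitization; the sketch below follows the pattern used by the sibling cards in this collection, and the repo name `open-llm-leaderboard/details_jondurbin__airoboros-7b-gpt4-1.2` is an assumption based on the standard `details_<org>__<model>` naming convention:
```python
from datasets import load_dataset

# Repo name assumed from the details_<org>__<model> convention used by the other cards
data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-7b-gpt4-1.2",
	"harness_winogrande_5",
	split="train")
```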
## Latest results
These are the latest results from run 2023-10-21T15:46:08.253072 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of jondurbin/airoboros-7b-gpt4-1.2",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model jondurbin/airoboros-7b-gpt4-1.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-21T15:46:08.253072(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of jondurbin/airoboros-7b-gpt4-1.2",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model jondurbin/airoboros-7b-gpt4-1.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-21T15:46:08.253072(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
23,
31,
171,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of jondurbin/airoboros-7b-gpt4-1.2## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model jondurbin/airoboros-7b-gpt4-1.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-21T15:46:08.253072(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
6f54b6bd031b3acf95de0428ba924e57a5d128df
|
# Dataset Card for Evaluation run of jondurbin/airoboros-l2-70b-gpt4-1.4.1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/jondurbin/airoboros-l2-70b-gpt4-1.4.1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [jondurbin/airoboros-l2-70b-gpt4-1.4.1](https://huggingface.co/jondurbin/airoboros-l2-70b-gpt4-1.4.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-l2-70b-gpt4-1.4.1",
"harness_winogrande_5",
split="train")
```
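Since this dataset exposes one configuration per evaluated task, it can be convenient to enumerate them programmatically before picking one to load; a minimal sketch using the standard `datasets` helper:
```python
from datasets import get_dataset_config_names

# Lists the per-task configurations (plus "results") exposed by this repo
configs = get_dataset_config_names("open-llm-leaderboard/details_jondurbin__airoboros-l2-70b-gpt4-1.4.1")
print(len(configs), configs[:5])
```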
## Latest results
These are the [latest results from run 2023-10-23T02:42:13.778888](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-l2-70b-gpt4-1.4.1/blob/main/results_2023-10-23T02-42-13.778888.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.07455956375838926,
"em_stderr": 0.002690082296020664,
"f1": 0.14032193791946285,
"f1_stderr": 0.0028633381416712784,
"acc": 0.5305016296936343,
"acc_stderr": 0.010958117599758718
},
"harness|drop|3": {
"em": 0.07455956375838926,
"em_stderr": 0.002690082296020664,
"f1": 0.14032193791946285,
"f1_stderr": 0.0028633381416712784
},
"harness|gsm8k|5": {
"acc": 0.22517058377558757,
"acc_stderr": 0.011505385424294634
},
"harness|winogrande|5": {
"acc": 0.8358326756116812,
"acc_stderr": 0.010410849775222804
}
}
```
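To retrieve these aggregated numbers programmatically rather than reading the JSON above, one can also load the "results" configuration directly; a minimal sketch, assuming the "latest" split defined in this card's metadata:
```python
from datasets import load_dataset

# The "results" config aggregates the per-task metrics shown above;
# the "latest" split points at the most recent evaluation run.
results = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-l2-70b-gpt4-1.4.1",
	"results",
	split="latest")
print(results[0])  # aggregated metrics for the latest run
```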
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_jondurbin__airoboros-l2-70b-gpt4-1.4.1
|
[
"region:us"
] |
2023-08-18T10:22:54+00:00
|
{"pretty_name": "Evaluation run of jondurbin/airoboros-l2-70b-gpt4-1.4.1", "dataset_summary": "Dataset automatically created during the evaluation run of model [jondurbin/airoboros-l2-70b-gpt4-1.4.1](https://huggingface.co/jondurbin/airoboros-l2-70b-gpt4-1.4.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jondurbin__airoboros-l2-70b-gpt4-1.4.1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-23T02:42:13.778888](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-l2-70b-gpt4-1.4.1/blob/main/results_2023-10-23T02-42-13.778888.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.07455956375838926,\n \"em_stderr\": 0.002690082296020664,\n \"f1\": 0.14032193791946285,\n \"f1_stderr\": 0.0028633381416712784,\n \"acc\": 0.5305016296936343,\n \"acc_stderr\": 0.010958117599758718\n },\n \"harness|drop|3\": {\n \"em\": 0.07455956375838926,\n \"em_stderr\": 0.002690082296020664,\n \"f1\": 0.14032193791946285,\n \"f1_stderr\": 0.0028633381416712784\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.22517058377558757,\n \"acc_stderr\": 0.011505385424294634\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8358326756116812,\n \"acc_stderr\": 0.010410849775222804\n }\n}\n```", "repo_url": "https://huggingface.co/jondurbin/airoboros-l2-70b-gpt4-1.4.1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_26T00_46_50.622735", "path": ["**/details_harness|arc:challenge|25_2023-07-26T00:46:50.622735.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-26T00:46:50.622735.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_23T02_42_13.778888", "path": ["**/details_harness|drop|3_2023-10-23T02-42-13.778888.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-23T02-42-13.778888.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_23T02_42_13.778888", "path": ["**/details_harness|gsm8k|5_2023-10-23T02-42-13.778888.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-23T02-42-13.778888.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_26T00_46_50.622735", "path": ["**/details_harness|hellaswag|10_2023-07-26T00:46:50.622735.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hellaswag|10_2023-07-26T00:46:50.622735.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_26T00_46_50.622735", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-26T00:46:50.622735.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-26T00:46:50.622735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-26T00:46:50.622735.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-26T00:46:50.622735.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-26T00:46:50.622735.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-07-26T00:46:50.622735.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_26T00_46_50.622735", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-26T00:46:50.622735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-26T00:46:50.622735.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_26T00_46_50.622735", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-26T00:46:50.622735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-26T00:46:50.622735.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_26T00_46_50.622735", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-26T00:46:50.622735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-26T00:46:50.622735.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_26T00_46_50.622735", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-26T00:46:50.622735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-26T00:46:50.622735.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_26T00_46_50.622735", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-26T00:46:50.622735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-26T00:46:50.622735.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_26T00_46_50.622735", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-26T00:46:50.622735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-26T00:46:50.622735.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_26T00_46_50.622735", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-26T00:46:50.622735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-26T00:46:50.622735.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_26T00_46_50.622735", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-26T00:46:50.622735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-26T00:46:50.622735.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_26T00_46_50.622735", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-26T00:46:50.622735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-26T00:46:50.622735.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_26T00_46_50.622735", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-26T00:46:50.622735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-26T00:46:50.622735.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_07_26T00_46_50.622735", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-26T00:46:50.622735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-26T00:46:50.622735.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_26T00_46_50.622735", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-26T00:46:50.622735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-26T00:46:50.622735.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_26T00_46_50.622735", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-26T00:46:50.622735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-26T00:46:50.622735.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_26T00_46_50.622735", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-26T00:46:50.622735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-26T00:46:50.622735.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_26T00_46_50.622735", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-26T00:46:50.622735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-26T00:46:50.622735.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_26T00_46_50.622735", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-26T00:46:50.622735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-26T00:46:50.622735.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_26T00_46_50.622735", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-26T00:46:50.622735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-26T00:46:50.622735.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_26T00_46_50.622735", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-26T00:46:50.622735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-26T00:46:50.622735.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_26T00_46_50.622735", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-26T00:46:50.622735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-26T00:46:50.622735.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_26T00_46_50.622735", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-26T00:46:50.622735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-26T00:46:50.622735.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_26T00_46_50.622735", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-26T00:46:50.622735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-26T00:46:50.622735.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_26T00_46_50.622735", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-26T00:46:50.622735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-26T00:46:50.622735.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_26T00_46_50.622735", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-26T00:46:50.622735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-26T00:46:50.622735.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_26T00_46_50.622735", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-26T00:46:50.622735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-26T00:46:50.622735.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_26T00_46_50.622735", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-26T00:46:50.622735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-26T00:46:50.622735.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_26T00_46_50.622735", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-26T00:46:50.622735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-26T00:46:50.622735.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_26T00_46_50.622735", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-26T00:46:50.622735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-26T00:46:50.622735.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_26T00_46_50.622735", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-26T00:46:50.622735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-26T00:46:50.622735.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_26T00_46_50.622735", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-26T00:46:50.622735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-26T00:46:50.622735.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_26T00_46_50.622735", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-26T00:46:50.622735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-26T00:46:50.622735.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_26T00_46_50.622735", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-26T00:46:50.622735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-26T00:46:50.622735.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_26T00_46_50.622735", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-26T00:46:50.622735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-26T00:46:50.622735.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_26T00_46_50.622735", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-26T00:46:50.622735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-26T00:46:50.622735.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_26T00_46_50.622735", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-26T00:46:50.622735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-26T00:46:50.622735.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_26T00_46_50.622735", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-26T00:46:50.622735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-26T00:46:50.622735.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_26T00_46_50.622735", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-26T00:46:50.622735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-26T00:46:50.622735.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_26T00_46_50.622735", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-26T00:46:50.622735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-26T00:46:50.622735.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_26T00_46_50.622735", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-26T00:46:50.622735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-26T00:46:50.622735.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_26T00_46_50.622735", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-26T00:46:50.622735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-26T00:46:50.622735.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_26T00_46_50.622735", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-26T00:46:50.622735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-26T00:46:50.622735.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_26T00_46_50.622735", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-26T00:46:50.622735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-26T00:46:50.622735.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_26T00_46_50.622735", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-26T00:46:50.622735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-26T00:46:50.622735.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_26T00_46_50.622735", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-26T00:46:50.622735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-26T00:46:50.622735.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_26T00_46_50.622735", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-26T00:46:50.622735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-26T00:46:50.622735.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_26T00_46_50.622735", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-26T00:46:50.622735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-26T00:46:50.622735.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_26T00_46_50.622735", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-26T00:46:50.622735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-26T00:46:50.622735.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_26T00_46_50.622735", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-26T00:46:50.622735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-26T00:46:50.622735.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_26T00_46_50.622735", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-26T00:46:50.622735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-26T00:46:50.622735.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_26T00_46_50.622735", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-26T00:46:50.622735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-26T00:46:50.622735.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_26T00_46_50.622735", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-26T00:46:50.622735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-26T00:46:50.622735.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_26T00_46_50.622735", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-26T00:46:50.622735.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-26T00:46:50.622735.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_26T00_46_50.622735", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-26T00:46:50.622735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-26T00:46:50.622735.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_26T00_46_50.622735", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-26T00:46:50.622735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-26T00:46:50.622735.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_26T00_46_50.622735", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-26T00:46:50.622735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-26T00:46:50.622735.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_26T00_46_50.622735", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-26T00:46:50.622735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-26T00:46:50.622735.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_26T00_46_50.622735", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-26T00:46:50.622735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-26T00:46:50.622735.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_26T00_46_50.622735", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-26T00:46:50.622735.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-26T00:46:50.622735.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_26T00_46_50.622735", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-26T00:46:50.622735.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-26T00:46:50.622735.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_23T02_42_13.778888", "path": ["**/details_harness|winogrande|5_2023-10-23T02-42-13.778888.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-23T02-42-13.778888.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_26T00_46_50.622735", "path": ["results_2023-07-26T00:46:50.622735.parquet"]}, {"split": "2023_10_23T02_42_13.778888", "path": ["results_2023-10-23T02-42-13.778888.parquet"]}, {"split": "latest", "path": ["results_2023-10-23T02-42-13.778888.parquet"]}]}]}
|
2023-10-23T01:42:26+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of jondurbin/airoboros-l2-70b-gpt4-1.4.1
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model jondurbin/airoboros-l2-70b-gpt4-1.4.1 on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
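The original code block was stripped from this card during processing; the following is a minimal sketch, assuming the repository follows the leaderboard's usual `open-llm-leaderboard/details_<org>__<model>` naming and using a config name taken from the metadata above:

```python
from datasets import load_dataset

# repository name inferred from the standard details_<org>__<model> pattern
data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-l2-70b-gpt4-1.4.1",
	"harness_winogrande_5",
	split="train")
```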
## Latest results
These are the latest results from run 2023-10-23T02:42:13.778888 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
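The results JSON itself was also stripped from this card; as a sketch, the aggregated numbers can be pulled from the "results" configuration listed in the metadata above (repository name again assumed from the standard naming pattern):

```python
from datasets import load_dataset

# "results" holds the aggregated metrics; "latest" points at the newest run
results = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-l2-70b-gpt4-1.4.1",
	"results",
	split="latest")
```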
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of jondurbin/airoboros-l2-70b-gpt4-1.4.1",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model jondurbin/airoboros-l2-70b-gpt4-1.4.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-23T02:42:13.778888(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of jondurbin/airoboros-l2-70b-gpt4-1.4.1",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model jondurbin/airoboros-l2-70b-gpt4-1.4.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-23T02:42:13.778888(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
27,
31,
175,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of jondurbin/airoboros-l2-70b-gpt4-1.4.1## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model jondurbin/airoboros-l2-70b-gpt4-1.4.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-23T02:42:13.778888(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
66a767b0f129e0b7dc4cf15a7ee7d865a1dd265d
|
# Dataset Card for Evaluation run of jondurbin/airoboros-65b-gpt4-1.2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/jondurbin/airoboros-65b-gpt4-1.2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [jondurbin/airoboros-65b-gpt4-1.2](https://huggingface.co/jondurbin/airoboros-65b-gpt4-1.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-65b-gpt4-1.2",
"harness_winogrande_5",
split="train")
```
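Since this dataset aggregates four runs, a specific run can also be loaded by passing its timestamped split name instead of "train". The sketch below is not part of the original card; the split name is taken from the `harness_winogrande_5` config listed in the metadata further down:

```python
from datasets import load_dataset

# load one specific timestamped run rather than the latest results
data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-65b-gpt4-1.2",
	"harness_winogrande_5",
	split="2023_10_22T18_25_44.617546")
```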
## Latest results
These are the [latest results from run 2023-10-22T19:27:20.004298](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-65b-gpt4-1.2/blob/main/results_2023-10-22T19-27-20.004298.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.16883389261744966,
"em_stderr": 0.0038363072356365218,
"f1": 0.24563443791946302,
"f1_stderr": 0.003921848688674613,
"acc": 0.5304663251500592,
"acc_stderr": 0.01174788303833344
},
"harness|drop|3": {
"em": 0.16883389261744966,
"em_stderr": 0.0038363072356365218,
"f1": 0.24563443791946302,
"f1_stderr": 0.003921848688674613
},
"harness|gsm8k|5": {
"acc": 0.265352539802881,
"acc_stderr": 0.012161675464069677
},
"harness|winogrande|5": {
"acc": 0.7955801104972375,
"acc_stderr": 0.011334090612597202
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_jondurbin__airoboros-65b-gpt4-1.2
|
[
"region:us"
] |
2023-08-18T10:23:03+00:00
|
{"pretty_name": "Evaluation run of jondurbin/airoboros-65b-gpt4-1.2", "dataset_summary": "Dataset automatically created during the evaluation run of model [jondurbin/airoboros-65b-gpt4-1.2](https://huggingface.co/jondurbin/airoboros-65b-gpt4-1.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jondurbin__airoboros-65b-gpt4-1.2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-22T19:27:20.004298](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-65b-gpt4-1.2/blob/main/results_2023-10-22T19-27-20.004298.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.16883389261744966,\n \"em_stderr\": 0.0038363072356365218,\n \"f1\": 0.24563443791946302,\n \"f1_stderr\": 0.003921848688674613,\n \"acc\": 0.5304663251500592,\n \"acc_stderr\": 0.01174788303833344\n },\n \"harness|drop|3\": {\n \"em\": 0.16883389261744966,\n \"em_stderr\": 0.0038363072356365218,\n \"f1\": 0.24563443791946302,\n \"f1_stderr\": 0.003921848688674613\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.265352539802881,\n \"acc_stderr\": 0.012161675464069677\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7955801104972375,\n \"acc_stderr\": 0.011334090612597202\n }\n}\n```", "repo_url": "https://huggingface.co/jondurbin/airoboros-65b-gpt4-1.2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_25T19_32_20.541789", "path": ["**/details_harness|arc:challenge|25_2023-07-25T19:32:20.541789.parquet"]}, {"split": "2023_08_03T17_35_02.727730", "path": ["**/details_harness|arc:challenge|25_2023-08-03T17:35:02.727730.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-03T17:35:02.727730.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_22T18_25_44.617546", "path": ["**/details_harness|drop|3_2023-10-22T18-25-44.617546.parquet"]}, {"split": "2023_10_22T19_27_20.004298", "path": ["**/details_harness|drop|3_2023-10-22T19-27-20.004298.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-22T19-27-20.004298.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_22T18_25_44.617546", "path": ["**/details_harness|gsm8k|5_2023-10-22T18-25-44.617546.parquet"]}, {"split": "2023_10_22T19_27_20.004298", "path": ["**/details_harness|gsm8k|5_2023-10-22T19-27-20.004298.parquet"]}, {"split": "latest", "path": 
["**/details_harness|gsm8k|5_2023-10-22T19-27-20.004298.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_25T19_32_20.541789", "path": ["**/details_harness|hellaswag|10_2023-07-25T19:32:20.541789.parquet"]}, {"split": "2023_08_03T17_35_02.727730", "path": ["**/details_harness|hellaswag|10_2023-08-03T17:35:02.727730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-03T17:35:02.727730.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_25T19_32_20.541789", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T19:32:20.541789.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-25T19:32:20.541789.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-25T19:32:20.541789.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T19:32:20.541789.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T19:32:20.541789.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-25T19:32:20.541789.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T19:32:20.541789.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T19:32:20.541789.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T19:32:20.541789.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T19:32:20.541789.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-25T19:32:20.541789.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-25T19:32:20.541789.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T19:32:20.541789.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-25T19:32:20.541789.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T19:32:20.541789.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T19:32:20.541789.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T19:32:20.541789.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-25T19:32:20.541789.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T19:32:20.541789.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T19:32:20.541789.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T19:32:20.541789.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T19:32:20.541789.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T19:32:20.541789.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T19:32:20.541789.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T19:32:20.541789.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T19:32:20.541789.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T19:32:20.541789.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T19:32:20.541789.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T19:32:20.541789.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T19:32:20.541789.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T19:32:20.541789.parquet", 
"**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T19:32:20.541789.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-25T19:32:20.541789.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T19:32:20.541789.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-25T19:32:20.541789.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T19:32:20.541789.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T19:32:20.541789.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T19:32:20.541789.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-25T19:32:20.541789.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-25T19:32:20.541789.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T19:32:20.541789.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T19:32:20.541789.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T19:32:20.541789.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T19:32:20.541789.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-25T19:32:20.541789.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-25T19:32:20.541789.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-25T19:32:20.541789.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T19:32:20.541789.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-25T19:32:20.541789.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T19:32:20.541789.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T19:32:20.541789.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-25T19:32:20.541789.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-25T19:32:20.541789.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-25T19:32:20.541789.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T19:32:20.541789.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-25T19:32:20.541789.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-25T19:32:20.541789.parquet"]}, {"split": "2023_08_03T17_35_02.727730", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-03T17:35:02.727730.parquet", 
"**/details_harness|hendrycksTest-econometrics|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-03T17:35:02.727730.parquet", 
"**/details_harness|hendrycksTest-public_relations|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-03T17:35:02.727730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-03T17:35:02.727730.parquet", 
"**/details_harness|hendrycksTest-human_aging|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-03T17:35:02.727730.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-03T17:35:02.727730.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_25T19_32_20.541789", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T19:32:20.541789.parquet"]}, {"split": "2023_08_03T17_35_02.727730", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-03T17:35:02.727730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-03T17:35:02.727730.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_25T19_32_20.541789", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-25T19:32:20.541789.parquet"]}, {"split": "2023_08_03T17_35_02.727730", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-03T17:35:02.727730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-03T17:35:02.727730.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_25T19_32_20.541789", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-25T19:32:20.541789.parquet"]}, {"split": "2023_08_03T17_35_02.727730", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-03T17:35:02.727730.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-astronomy|5_2023-08-03T17:35:02.727730.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_25T19_32_20.541789", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T19:32:20.541789.parquet"]}, {"split": "2023_08_03T17_35_02.727730", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-03T17:35:02.727730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-03T17:35:02.727730.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_25T19_32_20.541789", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T19:32:20.541789.parquet"]}, {"split": "2023_08_03T17_35_02.727730", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-03T17:35:02.727730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-03T17:35:02.727730.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_25T19_32_20.541789", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-25T19:32:20.541789.parquet"]}, {"split": "2023_08_03T17_35_02.727730", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-03T17:35:02.727730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-03T17:35:02.727730.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_25T19_32_20.541789", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T19:32:20.541789.parquet"]}, {"split": "2023_08_03T17_35_02.727730", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-03T17:35:02.727730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-03T17:35:02.727730.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_25T19_32_20.541789", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T19:32:20.541789.parquet"]}, {"split": "2023_08_03T17_35_02.727730", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-03T17:35:02.727730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-03T17:35:02.727730.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_25T19_32_20.541789", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T19:32:20.541789.parquet"]}, {"split": "2023_08_03T17_35_02.727730", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-03T17:35:02.727730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-03T17:35:02.727730.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_25T19_32_20.541789", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T19:32:20.541789.parquet"]}, {"split": "2023_08_03T17_35_02.727730", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-03T17:35:02.727730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-03T17:35:02.727730.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", 
"data_files": [{"split": "2023_07_25T19_32_20.541789", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-25T19:32:20.541789.parquet"]}, {"split": "2023_08_03T17_35_02.727730", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-03T17:35:02.727730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-03T17:35:02.727730.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_25T19_32_20.541789", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-25T19:32:20.541789.parquet"]}, {"split": "2023_08_03T17_35_02.727730", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-03T17:35:02.727730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-03T17:35:02.727730.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_25T19_32_20.541789", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T19:32:20.541789.parquet"]}, {"split": "2023_08_03T17_35_02.727730", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-03T17:35:02.727730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-03T17:35:02.727730.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_25T19_32_20.541789", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-25T19:32:20.541789.parquet"]}, {"split": "2023_08_03T17_35_02.727730", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-03T17:35:02.727730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-03T17:35:02.727730.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_25T19_32_20.541789", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T19:32:20.541789.parquet"]}, {"split": "2023_08_03T17_35_02.727730", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-03T17:35:02.727730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-03T17:35:02.727730.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_25T19_32_20.541789", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T19:32:20.541789.parquet"]}, {"split": "2023_08_03T17_35_02.727730", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-03T17:35:02.727730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-03T17:35:02.727730.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_25T19_32_20.541789", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T19:32:20.541789.parquet"]}, {"split": "2023_08_03T17_35_02.727730", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-03T17:35:02.727730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-03T17:35:02.727730.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_25T19_32_20.541789", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-25T19:32:20.541789.parquet"]}, {"split": 
"2023_08_03T17_35_02.727730", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-03T17:35:02.727730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-03T17:35:02.727730.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_25T19_32_20.541789", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T19:32:20.541789.parquet"]}, {"split": "2023_08_03T17_35_02.727730", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-03T17:35:02.727730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-03T17:35:02.727730.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_25T19_32_20.541789", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T19:32:20.541789.parquet"]}, {"split": "2023_08_03T17_35_02.727730", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-03T17:35:02.727730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-03T17:35:02.727730.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_25T19_32_20.541789", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T19:32:20.541789.parquet"]}, {"split": "2023_08_03T17_35_02.727730", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-03T17:35:02.727730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-03T17:35:02.727730.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_25T19_32_20.541789", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T19:32:20.541789.parquet"]}, {"split": "2023_08_03T17_35_02.727730", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-03T17:35:02.727730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-03T17:35:02.727730.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_25T19_32_20.541789", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T19:32:20.541789.parquet"]}, {"split": "2023_08_03T17_35_02.727730", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-03T17:35:02.727730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-03T17:35:02.727730.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_25T19_32_20.541789", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T19:32:20.541789.parquet"]}, {"split": "2023_08_03T17_35_02.727730", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-03T17:35:02.727730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-03T17:35:02.727730.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_25T19_32_20.541789", "path": 
["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T19:32:20.541789.parquet"]}, {"split": "2023_08_03T17_35_02.727730", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-03T17:35:02.727730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-03T17:35:02.727730.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_25T19_32_20.541789", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T19:32:20.541789.parquet"]}, {"split": "2023_08_03T17_35_02.727730", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-03T17:35:02.727730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-03T17:35:02.727730.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_25T19_32_20.541789", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T19:32:20.541789.parquet"]}, {"split": "2023_08_03T17_35_02.727730", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-03T17:35:02.727730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-03T17:35:02.727730.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_25T19_32_20.541789", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T19:32:20.541789.parquet"]}, {"split": "2023_08_03T17_35_02.727730", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-03T17:35:02.727730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-03T17:35:02.727730.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_25T19_32_20.541789", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T19:32:20.541789.parquet"]}, {"split": "2023_08_03T17_35_02.727730", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-03T17:35:02.727730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-03T17:35:02.727730.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_25T19_32_20.541789", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T19:32:20.541789.parquet"]}, {"split": "2023_08_03T17_35_02.727730", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-03T17:35:02.727730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-03T17:35:02.727730.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_25T19_32_20.541789", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T19:32:20.541789.parquet"]}, {"split": "2023_08_03T17_35_02.727730", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-03T17:35:02.727730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-03T17:35:02.727730.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_25T19_32_20.541789", "path": 
["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T19:32:20.541789.parquet"]}, {"split": "2023_08_03T17_35_02.727730", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-03T17:35:02.727730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-03T17:35:02.727730.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_25T19_32_20.541789", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-25T19:32:20.541789.parquet"]}, {"split": "2023_08_03T17_35_02.727730", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-03T17:35:02.727730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-03T17:35:02.727730.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_25T19_32_20.541789", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T19:32:20.541789.parquet"]}, {"split": "2023_08_03T17_35_02.727730", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-03T17:35:02.727730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-03T17:35:02.727730.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_25T19_32_20.541789", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-25T19:32:20.541789.parquet"]}, {"split": "2023_08_03T17_35_02.727730", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-03T17:35:02.727730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-03T17:35:02.727730.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_25T19_32_20.541789", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T19:32:20.541789.parquet"]}, {"split": "2023_08_03T17_35_02.727730", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-03T17:35:02.727730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-03T17:35:02.727730.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_25T19_32_20.541789", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T19:32:20.541789.parquet"]}, {"split": "2023_08_03T17_35_02.727730", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-03T17:35:02.727730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-03T17:35:02.727730.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_25T19_32_20.541789", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T19:32:20.541789.parquet"]}, {"split": "2023_08_03T17_35_02.727730", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-03T17:35:02.727730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-03T17:35:02.727730.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_25T19_32_20.541789", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-25T19:32:20.541789.parquet"]}, {"split": "2023_08_03T17_35_02.727730", "path": 
["**/details_harness|hendrycksTest-management|5_2023-08-03T17:35:02.727730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-03T17:35:02.727730.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_25T19_32_20.541789", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-25T19:32:20.541789.parquet"]}, {"split": "2023_08_03T17_35_02.727730", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-03T17:35:02.727730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-03T17:35:02.727730.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_25T19_32_20.541789", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T19:32:20.541789.parquet"]}, {"split": "2023_08_03T17_35_02.727730", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-03T17:35:02.727730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-03T17:35:02.727730.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_25T19_32_20.541789", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T19:32:20.541789.parquet"]}, {"split": "2023_08_03T17_35_02.727730", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-03T17:35:02.727730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-03T17:35:02.727730.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_25T19_32_20.541789", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T19:32:20.541789.parquet"]}, {"split": "2023_08_03T17_35_02.727730", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-03T17:35:02.727730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-03T17:35:02.727730.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_25T19_32_20.541789", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T19:32:20.541789.parquet"]}, {"split": "2023_08_03T17_35_02.727730", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-03T17:35:02.727730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-03T17:35:02.727730.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_25T19_32_20.541789", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-25T19:32:20.541789.parquet"]}, {"split": "2023_08_03T17_35_02.727730", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-03T17:35:02.727730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-03T17:35:02.727730.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_25T19_32_20.541789", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-25T19:32:20.541789.parquet"]}, {"split": "2023_08_03T17_35_02.727730", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-03T17:35:02.727730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-03T17:35:02.727730.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": 
"2023_07_25T19_32_20.541789", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-25T19:32:20.541789.parquet"]}, {"split": "2023_08_03T17_35_02.727730", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-03T17:35:02.727730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-03T17:35:02.727730.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_25T19_32_20.541789", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T19:32:20.541789.parquet"]}, {"split": "2023_08_03T17_35_02.727730", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-03T17:35:02.727730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-03T17:35:02.727730.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_25T19_32_20.541789", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-25T19:32:20.541789.parquet"]}, {"split": "2023_08_03T17_35_02.727730", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-03T17:35:02.727730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-03T17:35:02.727730.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_25T19_32_20.541789", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T19:32:20.541789.parquet"]}, {"split": "2023_08_03T17_35_02.727730", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-03T17:35:02.727730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-03T17:35:02.727730.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_25T19_32_20.541789", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T19:32:20.541789.parquet"]}, {"split": "2023_08_03T17_35_02.727730", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-03T17:35:02.727730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-03T17:35:02.727730.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_25T19_32_20.541789", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-25T19:32:20.541789.parquet"]}, {"split": "2023_08_03T17_35_02.727730", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-03T17:35:02.727730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-03T17:35:02.727730.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_25T19_32_20.541789", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-25T19:32:20.541789.parquet"]}, {"split": "2023_08_03T17_35_02.727730", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-03T17:35:02.727730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-03T17:35:02.727730.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_25T19_32_20.541789", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-25T19:32:20.541789.parquet"]}, {"split": 
"2023_08_03T17_35_02.727730", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-03T17:35:02.727730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-03T17:35:02.727730.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_25T19_32_20.541789", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T19:32:20.541789.parquet"]}, {"split": "2023_08_03T17_35_02.727730", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-03T17:35:02.727730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-03T17:35:02.727730.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_25T19_32_20.541789", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-25T19:32:20.541789.parquet"]}, {"split": "2023_08_03T17_35_02.727730", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-03T17:35:02.727730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-03T17:35:02.727730.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_25T19_32_20.541789", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-25T19:32:20.541789.parquet"]}, {"split": "2023_08_03T17_35_02.727730", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-03T17:35:02.727730.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-03T17:35:02.727730.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_25T19_32_20.541789", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-25T19:32:20.541789.parquet"]}, {"split": "2023_08_03T17_35_02.727730", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-03T17:35:02.727730.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-03T17:35:02.727730.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_22T18_25_44.617546", "path": ["**/details_harness|winogrande|5_2023-10-22T18-25-44.617546.parquet"]}, {"split": "2023_10_22T19_27_20.004298", "path": ["**/details_harness|winogrande|5_2023-10-22T19-27-20.004298.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-22T19-27-20.004298.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_25T19_32_20.541789", "path": ["results_2023-07-25T19:32:20.541789.parquet"]}, {"split": "2023_08_03T17_35_02.727730", "path": ["results_2023-08-03T17:35:02.727730.parquet"]}, {"split": "2023_10_22T18_25_44.617546", "path": ["results_2023-10-22T18-25-44.617546.parquet"]}, {"split": "2023_10_22T19_27_20.004298", "path": ["results_2023-10-22T19-27-20.004298.parquet"]}, {"split": "latest", "path": ["results_2023-10-22T19-27-20.004298.parquet"]}]}]}
|
2023-10-22T18:27:28+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of jondurbin/airoboros-65b-gpt4-1.2
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model jondurbin/airoboros-65b-gpt4-1.2 on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
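A minimal sketch of that call (the code block was stripped from this copy of the card; the repo id below is an assumption, following the leaderboard's `details_<org>__<model>` naming convention seen elsewhere in this document):

```python
from datasets import load_dataset

# Assumed repo id (details_<org>__<model> convention);
# "harness_winogrande_5" is one of the 64 task configurations
data = load_dataset(
    "open-llm-leaderboard/details_jondurbin__airoboros-65b-gpt4-1.2",
    "harness_winogrande_5",
    split="train",  # always points to the latest results
)
```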
## Latest results
These are the latest results from run 2023-10-22T19:27:20.004298 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of jondurbin/airoboros-65b-gpt4-1.2",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model jondurbin/airoboros-65b-gpt4-1.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-22T19:27:20.004298(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of jondurbin/airoboros-65b-gpt4-1.2",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model jondurbin/airoboros-65b-gpt4-1.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-22T19:27:20.004298(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
24,
31,
172,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of jondurbin/airoboros-65b-gpt4-1.2## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model jondurbin/airoboros-65b-gpt4-1.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-22T19:27:20.004298(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
dad3bc7f726fe51dbec37626d90af26910b2c527
|
# Dataset Card for Evaluation run of jondurbin/airoboros-13b-gpt4-1.2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/jondurbin/airoboros-13b-gpt4-1.2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [jondurbin/airoboros-13b-gpt4-1.2](https://huggingface.co/jondurbin/airoboros-13b-gpt4-1.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-13b-gpt4-1.2",
"harness_winogrande_5",
split="train")
```
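Besides the "train" and "latest" pointers, each timestamped run is exposed as its own split; for instance (split name taken from this repo's config listing below):

```python
from datasets import load_dataset

# Load the details of one specific run via its timestamped split name
# (split names follow the pattern YYYY_MM_DDTHH_MM_SS.ffffff)
run_details = load_dataset(
    "open-llm-leaderboard/details_jondurbin__airoboros-13b-gpt4-1.2",
    "harness_gsm8k_5",
    split="2023_10_22T14_40_33.943604",
)
```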
## Latest results
These are the [latest results from run 2023-10-22T14:40:33.943604](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-13b-gpt4-1.2/blob/main/results_2023-10-22T14-40-33.943604.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.15058724832214765,
"em_stderr": 0.0036626307588979133,
"f1": 0.23444735738255124,
"f1_stderr": 0.0038225888705093877,
"acc": 0.3875254087996874,
"acc_stderr": 0.008846716230878262
},
"harness|drop|3": {
"em": 0.15058724832214765,
"em_stderr": 0.0036626307588979133,
"f1": 0.23444735738255124,
"f1_stderr": 0.0038225888705093877
},
"harness|gsm8k|5": {
"acc": 0.03866565579984837,
"acc_stderr": 0.005310583162098057
},
"harness|winogrande|5": {
"acc": 0.7363851617995264,
"acc_stderr": 0.012382849299658468
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_jondurbin__airoboros-13b-gpt4-1.2
|
[
"region:us"
] |
2023-08-18T10:23:13+00:00
|
{"pretty_name": "Evaluation run of jondurbin/airoboros-13b-gpt4-1.2", "dataset_summary": "Dataset automatically created during the evaluation run of model [jondurbin/airoboros-13b-gpt4-1.2](https://huggingface.co/jondurbin/airoboros-13b-gpt4-1.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jondurbin__airoboros-13b-gpt4-1.2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-22T14:40:33.943604](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-13b-gpt4-1.2/blob/main/results_2023-10-22T14-40-33.943604.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.15058724832214765,\n \"em_stderr\": 0.0036626307588979133,\n \"f1\": 0.23444735738255124,\n \"f1_stderr\": 0.0038225888705093877,\n \"acc\": 0.3875254087996874,\n \"acc_stderr\": 0.008846716230878262\n },\n \"harness|drop|3\": {\n \"em\": 0.15058724832214765,\n \"em_stderr\": 0.0036626307588979133,\n \"f1\": 0.23444735738255124,\n \"f1_stderr\": 0.0038225888705093877\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.03866565579984837,\n \"acc_stderr\": 0.005310583162098057\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7363851617995264,\n \"acc_stderr\": 0.012382849299658468\n }\n}\n```", "repo_url": "https://huggingface.co/jondurbin/airoboros-13b-gpt4-1.2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_08T16_30_54.666382", "path": ["**/details_harness|arc:challenge|25_2023-08-08T16:30:54.666382.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-08T16:30:54.666382.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_22T14_40_33.943604", "path": ["**/details_harness|drop|3_2023-10-22T14-40-33.943604.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-22T14-40-33.943604.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_22T14_40_33.943604", "path": ["**/details_harness|gsm8k|5_2023-10-22T14-40-33.943604.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-22T14-40-33.943604.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_08T16_30_54.666382", "path": ["**/details_harness|hellaswag|10_2023-08-08T16:30:54.666382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-08T16:30:54.666382.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_5", "data_files": [{"split": "2023_08_08T16_30_54.666382", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-08T16:30:54.666382.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-08T16:30:54.666382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-08T16:30:54.666382.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-08T16:30:54.666382.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-08T16:30:54.666382.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_08_08T16_30_54.666382", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-08T16:30:54.666382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-08T16:30:54.666382.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_08T16_30_54.666382", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-08T16:30:54.666382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-08T16:30:54.666382.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_08T16_30_54.666382", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-08T16:30:54.666382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-08T16:30:54.666382.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_08T16_30_54.666382", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-08T16:30:54.666382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-08T16:30:54.666382.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_08T16_30_54.666382", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-08T16:30:54.666382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-08T16:30:54.666382.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_08T16_30_54.666382", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-08T16:30:54.666382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-08T16:30:54.666382.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_08T16_30_54.666382", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-08T16:30:54.666382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-08T16:30:54.666382.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_08T16_30_54.666382", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-08T16:30:54.666382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-08T16:30:54.666382.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_08T16_30_54.666382", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-08T16:30:54.666382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-08T16:30:54.666382.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_08T16_30_54.666382", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-08T16:30:54.666382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-08T16:30:54.666382.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_08T16_30_54.666382", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-08T16:30:54.666382.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-08T16:30:54.666382.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_08T16_30_54.666382", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-08T16:30:54.666382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-08T16:30:54.666382.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_08T16_30_54.666382", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-08T16:30:54.666382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-08T16:30:54.666382.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_08T16_30_54.666382", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-08T16:30:54.666382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-08T16:30:54.666382.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_08T16_30_54.666382", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-08T16:30:54.666382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-08T16:30:54.666382.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_08T16_30_54.666382", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-08T16:30:54.666382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-08T16:30:54.666382.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_08T16_30_54.666382", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-08T16:30:54.666382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-08T16:30:54.666382.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_08T16_30_54.666382", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-08T16:30:54.666382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-08T16:30:54.666382.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_08T16_30_54.666382", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-08T16:30:54.666382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-08T16:30:54.666382.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_08T16_30_54.666382", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-08T16:30:54.666382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-08T16:30:54.666382.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_08T16_30_54.666382", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-08T16:30:54.666382.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-08T16:30:54.666382.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_08T16_30_54.666382", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-08T16:30:54.666382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-08T16:30:54.666382.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_08T16_30_54.666382", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-08T16:30:54.666382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-08T16:30:54.666382.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_08T16_30_54.666382", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-08T16:30:54.666382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-08T16:30:54.666382.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_08T16_30_54.666382", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-08T16:30:54.666382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-08T16:30:54.666382.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_08T16_30_54.666382", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-08T16:30:54.666382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-08T16:30:54.666382.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_08T16_30_54.666382", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-08T16:30:54.666382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-08T16:30:54.666382.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_08T16_30_54.666382", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-08T16:30:54.666382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-08T16:30:54.666382.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_08T16_30_54.666382", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-08T16:30:54.666382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-08T16:30:54.666382.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_08T16_30_54.666382", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-08T16:30:54.666382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-08T16:30:54.666382.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_08T16_30_54.666382", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-08T16:30:54.666382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-08T16:30:54.666382.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_08T16_30_54.666382", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-08T16:30:54.666382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-08T16:30:54.666382.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_08T16_30_54.666382", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-08T16:30:54.666382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-08T16:30:54.666382.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_08T16_30_54.666382", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-08T16:30:54.666382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-08T16:30:54.666382.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_08T16_30_54.666382", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-08T16:30:54.666382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-08T16:30:54.666382.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_08T16_30_54.666382", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-08T16:30:54.666382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-08T16:30:54.666382.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_08T16_30_54.666382", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-08T16:30:54.666382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-08T16:30:54.666382.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_08T16_30_54.666382", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-08T16:30:54.666382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-08T16:30:54.666382.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_08T16_30_54.666382", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-08T16:30:54.666382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-08T16:30:54.666382.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_08T16_30_54.666382", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-08T16:30:54.666382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-08T16:30:54.666382.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_08T16_30_54.666382", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-08T16:30:54.666382.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-08T16:30:54.666382.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_08T16_30_54.666382", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-08T16:30:54.666382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-08T16:30:54.666382.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_08T16_30_54.666382", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-08T16:30:54.666382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-08T16:30:54.666382.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_08T16_30_54.666382", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-08T16:30:54.666382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-08T16:30:54.666382.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_08T16_30_54.666382", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-08T16:30:54.666382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-08T16:30:54.666382.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_08T16_30_54.666382", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-08T16:30:54.666382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-08T16:30:54.666382.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_08T16_30_54.666382", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-08T16:30:54.666382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-08T16:30:54.666382.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_08T16_30_54.666382", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-08T16:30:54.666382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-08T16:30:54.666382.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_08T16_30_54.666382", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-08T16:30:54.666382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-08T16:30:54.666382.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_08T16_30_54.666382", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-08T16:30:54.666382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-08T16:30:54.666382.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_08T16_30_54.666382", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-08T16:30:54.666382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-08T16:30:54.666382.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_08_08T16_30_54.666382", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-08T16:30:54.666382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-08T16:30:54.666382.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_08T16_30_54.666382", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-08T16:30:54.666382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-08T16:30:54.666382.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_08T16_30_54.666382", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-08T16:30:54.666382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-08T16:30:54.666382.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_08T16_30_54.666382", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-08T16:30:54.666382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-08T16:30:54.666382.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_08T16_30_54.666382", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-08T16:30:54.666382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-08T16:30:54.666382.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_08T16_30_54.666382", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-08T16:30:54.666382.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-08T16:30:54.666382.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_08T16_30_54.666382", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-08T16:30:54.666382.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-08T16:30:54.666382.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_22T14_40_33.943604", "path": ["**/details_harness|winogrande|5_2023-10-22T14-40-33.943604.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-22T14-40-33.943604.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_08T16_30_54.666382", "path": ["results_2023-08-08T16:30:54.666382.parquet"]}, {"split": "2023_10_22T14_40_33.943604", "path": ["results_2023-10-22T14-40-33.943604.parquet"]}, {"split": "latest", "path": ["results_2023-10-22T14-40-33.943604.parquet"]}]}]}
|
2023-10-22T13:40:45+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of jondurbin/airoboros-13b-gpt4-1.2
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model jondurbin/airoboros-13b-gpt4-1.2 on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-10-22T14:40:33.943604 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of jondurbin/airoboros-13b-gpt4-1.2",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model jondurbin/airoboros-13b-gpt4-1.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-22T14:40:33.943604(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of jondurbin/airoboros-13b-gpt4-1.2",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model jondurbin/airoboros-13b-gpt4-1.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-22T14:40:33.943604(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
23,
31,
171,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of jondurbin/airoboros-13b-gpt4-1.2## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model jondurbin/airoboros-13b-gpt4-1.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-22T14:40:33.943604(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
99a611005a15d7209b7eb9989e087d9a7508d86e
|
# Dataset Card for Evaluation run of jondurbin/airoboros-l2-7b-gpt4-1.4.1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/jondurbin/airoboros-l2-7b-gpt4-1.4.1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [jondurbin/airoboros-l2-7b-gpt4-1.4.1](https://huggingface.co/jondurbin/airoboros-l2-7b-gpt4-1.4.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-l2-7b-gpt4-1.4.1",
"harness_winogrande_5",
split="train")
```
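If you are unsure which configurations exist, or want to pin the most recent evaluation rather than a timestamped split, the standard `datasets` utilities cover both. The snippet below is a minimal sketch that relies only on the split layout described above:

```python
from datasets import get_dataset_config_names, load_dataset

# Enumerate the evaluation configurations stored in this repository
# (one per task, plus the aggregated "results" configuration).
configs = get_dataset_config_names("open-llm-leaderboard/details_jondurbin__airoboros-l2-7b-gpt4-1.4.1")
print(len(configs))

# Every configuration also exposes a "latest" split that always
# points at the most recent run for that task; timestamped splits
# (e.g. "2023_10_22T13_45_24.528453") select a specific run instead.
data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-l2-7b-gpt4-1.4.1",
	"harness_winogrande_5",
	split="latest")
```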
## Latest results
These are the [latest results from run 2023-10-22T13:45:24.528453](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-l2-7b-gpt4-1.4.1/blob/main/results_2023-10-22T13-45-24.528453.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.006816275167785235,
"em_stderr": 0.0008426127095859245,
"f1": 0.06961933724832228,
"f1_stderr": 0.0015867120919901235,
"acc": 0.38537542193417434,
"acc_stderr": 0.008417109576351015
},
"harness|drop|3": {
"em": 0.006816275167785235,
"em_stderr": 0.0008426127095859245,
"f1": 0.06961933724832228,
"f1_stderr": 0.0015867120919901235
},
"harness|gsm8k|5": {
"acc": 0.028051554207733132,
"acc_stderr": 0.004548229533836327
},
"harness|winogrande|5": {
"acc": 0.7426992896606156,
"acc_stderr": 0.012285989618865704
}
}
```
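Since these metrics are plain JSON-style data, they can be compared programmatically once loaded. Below is a minimal sketch assuming the dictionary above has been stored in a variable named `results` (the variable name and the inline copy of the numbers are illustrative):

```python
# `results` is assumed to hold the dictionary shown above.
results = {
    "all": {"acc": 0.38537542193417434},
    "harness|gsm8k|5": {"acc": 0.028051554207733132},
    "harness|winogrande|5": {"acc": 0.7426992896606156},
}

# Rank the accuracy-based tasks from strongest to weakest.
acc_tasks = {task: m["acc"] for task, m in results.items()
             if task != "all" and "acc" in m}
for task, acc in sorted(acc_tasks.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{task}: {acc:.4f}")
```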
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_jondurbin__airoboros-l2-7b-gpt4-1.4.1
|
[
"region:us"
] |
2023-08-18T10:23:30+00:00
|
{"pretty_name": "Evaluation run of jondurbin/airoboros-l2-7b-gpt4-1.4.1", "dataset_summary": "Dataset automatically created during the evaluation run of model [jondurbin/airoboros-l2-7b-gpt4-1.4.1](https://huggingface.co/jondurbin/airoboros-l2-7b-gpt4-1.4.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jondurbin__airoboros-l2-7b-gpt4-1.4.1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-22T13:45:24.528453](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-l2-7b-gpt4-1.4.1/blob/main/results_2023-10-22T13-45-24.528453.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.006816275167785235,\n \"em_stderr\": 0.0008426127095859245,\n \"f1\": 0.06961933724832228,\n \"f1_stderr\": 0.0015867120919901235,\n \"acc\": 0.38537542193417434,\n \"acc_stderr\": 0.008417109576351015\n },\n \"harness|drop|3\": {\n \"em\": 0.006816275167785235,\n \"em_stderr\": 0.0008426127095859245,\n \"f1\": 0.06961933724832228,\n \"f1_stderr\": 0.0015867120919901235\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.028051554207733132,\n \"acc_stderr\": 0.004548229533836327\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7426992896606156,\n \"acc_stderr\": 0.012285989618865704\n }\n}\n```", "repo_url": "https://huggingface.co/jondurbin/airoboros-l2-7b-gpt4-1.4.1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_25T12_34_49.060263", "path": ["**/details_harness|arc:challenge|25_2023-07-25T12:34:49.060263.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-25T12:34:49.060263.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_22T13_45_24.528453", "path": ["**/details_harness|drop|3_2023-10-22T13-45-24.528453.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-22T13-45-24.528453.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_22T13_45_24.528453", "path": ["**/details_harness|gsm8k|5_2023-10-22T13-45-24.528453.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-22T13-45-24.528453.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_25T12_34_49.060263", "path": ["**/details_harness|hellaswag|10_2023-07-25T12:34:49.060263.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hellaswag|10_2023-07-25T12:34:49.060263.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_25T12_34_49.060263", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T12:34:49.060263.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-25T12:34:49.060263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-25T12:34:49.060263.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T12:34:49.060263.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-25T12:34:49.060263.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-07-25T12:34:49.060263.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_25T12_34_49.060263", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T12:34:49.060263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T12:34:49.060263.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_25T12_34_49.060263", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-25T12:34:49.060263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-25T12:34:49.060263.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_25T12_34_49.060263", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-25T12:34:49.060263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-25T12:34:49.060263.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_25T12_34_49.060263", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T12:34:49.060263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T12:34:49.060263.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_25T12_34_49.060263", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T12:34:49.060263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T12:34:49.060263.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_25T12_34_49.060263", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-25T12:34:49.060263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-25T12:34:49.060263.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_25T12_34_49.060263", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T12:34:49.060263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T12:34:49.060263.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_25T12_34_49.060263", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T12:34:49.060263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T12:34:49.060263.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_25T12_34_49.060263", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T12:34:49.060263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T12:34:49.060263.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_25T12_34_49.060263", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T12:34:49.060263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T12:34:49.060263.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_07_25T12_34_49.060263", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-25T12:34:49.060263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-25T12:34:49.060263.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_25T12_34_49.060263", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-25T12:34:49.060263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-25T12:34:49.060263.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_25T12_34_49.060263", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T12:34:49.060263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T12:34:49.060263.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_25T12_34_49.060263", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-25T12:34:49.060263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-25T12:34:49.060263.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_25T12_34_49.060263", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T12:34:49.060263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T12:34:49.060263.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_25T12_34_49.060263", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T12:34:49.060263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T12:34:49.060263.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_25T12_34_49.060263", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T12:34:49.060263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T12:34:49.060263.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_25T12_34_49.060263", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-25T12:34:49.060263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-25T12:34:49.060263.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_25T12_34_49.060263", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T12:34:49.060263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T12:34:49.060263.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_25T12_34_49.060263", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T12:34:49.060263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T12:34:49.060263.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_25T12_34_49.060263", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T12:34:49.060263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T12:34:49.060263.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_25T12_34_49.060263", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T12:34:49.060263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T12:34:49.060263.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_25T12_34_49.060263", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T12:34:49.060263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T12:34:49.060263.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_25T12_34_49.060263", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T12:34:49.060263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T12:34:49.060263.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_25T12_34_49.060263", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T12:34:49.060263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T12:34:49.060263.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_25T12_34_49.060263", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T12:34:49.060263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T12:34:49.060263.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_25T12_34_49.060263", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T12:34:49.060263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T12:34:49.060263.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_25T12_34_49.060263", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T12:34:49.060263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T12:34:49.060263.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_25T12_34_49.060263", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T12:34:49.060263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T12:34:49.060263.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_25T12_34_49.060263", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T12:34:49.060263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T12:34:49.060263.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_25T12_34_49.060263", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T12:34:49.060263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T12:34:49.060263.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_25T12_34_49.060263", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T12:34:49.060263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T12:34:49.060263.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_25T12_34_49.060263", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-25T12:34:49.060263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-25T12:34:49.060263.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_25T12_34_49.060263", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T12:34:49.060263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T12:34:49.060263.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_25T12_34_49.060263", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-25T12:34:49.060263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-25T12:34:49.060263.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_25T12_34_49.060263", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T12:34:49.060263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T12:34:49.060263.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_25T12_34_49.060263", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T12:34:49.060263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T12:34:49.060263.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_25T12_34_49.060263", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T12:34:49.060263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T12:34:49.060263.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_25T12_34_49.060263", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-25T12:34:49.060263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-25T12:34:49.060263.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_25T12_34_49.060263", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-25T12:34:49.060263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-25T12:34:49.060263.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_25T12_34_49.060263", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T12:34:49.060263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T12:34:49.060263.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_25T12_34_49.060263", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T12:34:49.060263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T12:34:49.060263.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_25T12_34_49.060263", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T12:34:49.060263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T12:34:49.060263.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_25T12_34_49.060263", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T12:34:49.060263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T12:34:49.060263.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_25T12_34_49.060263", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-25T12:34:49.060263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-25T12:34:49.060263.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_25T12_34_49.060263", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-25T12:34:49.060263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-25T12:34:49.060263.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_25T12_34_49.060263", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-25T12:34:49.060263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-25T12:34:49.060263.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_25T12_34_49.060263", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T12:34:49.060263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T12:34:49.060263.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_25T12_34_49.060263", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-25T12:34:49.060263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-25T12:34:49.060263.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_25T12_34_49.060263", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T12:34:49.060263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T12:34:49.060263.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_25T12_34_49.060263", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T12:34:49.060263.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T12:34:49.060263.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_25T12_34_49.060263", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-25T12:34:49.060263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-25T12:34:49.060263.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_25T12_34_49.060263", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-25T12:34:49.060263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-25T12:34:49.060263.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_25T12_34_49.060263", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-25T12:34:49.060263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-25T12:34:49.060263.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_25T12_34_49.060263", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T12:34:49.060263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T12:34:49.060263.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_25T12_34_49.060263", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-25T12:34:49.060263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-25T12:34:49.060263.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_25T12_34_49.060263", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-25T12:34:49.060263.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-25T12:34:49.060263.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_25T12_34_49.060263", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-25T12:34:49.060263.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-25T12:34:49.060263.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_22T13_45_24.528453", "path": ["**/details_harness|winogrande|5_2023-10-22T13-45-24.528453.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-22T13-45-24.528453.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_25T12_34_49.060263", "path": ["results_2023-07-25T12:34:49.060263.parquet"]}, {"split": "2023_10_22T13_45_24.528453", "path": ["results_2023-10-22T13-45-24.528453.parquet"]}, {"split": "latest", "path": ["results_2023-10-22T13-45-24.528453.parquet"]}]}]}
|
2023-10-22T12:45:37+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of jondurbin/airoboros-l2-7b-gpt4-1.4.1
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model jondurbin/airoboros-l2-7b-gpt4-1.4.1 on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-10-22T13:45:24.528453 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of jondurbin/airoboros-l2-7b-gpt4-1.4.1",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model jondurbin/airoboros-l2-7b-gpt4-1.4.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-22T13:45:24.528453(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of jondurbin/airoboros-l2-7b-gpt4-1.4.1",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model jondurbin/airoboros-l2-7b-gpt4-1.4.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-22T13:45:24.528453(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
27,
31,
175,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of jondurbin/airoboros-l2-7b-gpt4-1.4.1## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model jondurbin/airoboros-l2-7b-gpt4-1.4.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-22T13:45:24.528453(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
47b785cb9442db986045c0f1748bbfffed73d065
|
# Dataset Card for Evaluation run of jondurbin/airoboros-65b-gpt4-m2.0
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/jondurbin/airoboros-65b-gpt4-m2.0
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [jondurbin/airoboros-65b-gpt4-m2.0](https://huggingface.co/jondurbin/airoboros-65b-gpt4-m2.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-65b-gpt4-m2.0",
"harness_winogrande_5",
split="train")
```
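The aggregated numbers shown below can also be read straight from the "results" configuration instead of the per-task detail files. This is a sketch that assumes the same configuration and split layout as the other leaderboard detail repositories:

```python
from datasets import load_dataset

# Aggregated metrics for the most recent run of this model.
results = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-65b-gpt4-m2.0",
	"results",
	split="latest")
print(results[0])
```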
## Latest results
These are the [latest results from run 2023-10-22T21:36:42.557922](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-65b-gpt4-m2.0/blob/main/results_2023-10-22T21-36-42.557922.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.07036493288590603,
"em_stderr": 0.0026192324279004876,
"f1": 0.14583787751677768,
"f1_stderr": 0.002841532518554861,
"acc": 0.5116370357826509,
"acc_stderr": 0.011318931374370282
},
"harness|drop|3": {
"em": 0.07036493288590603,
"em_stderr": 0.0026192324279004876,
"f1": 0.14583787751677768,
"f1_stderr": 0.002841532518554861
},
"harness|gsm8k|5": {
"acc": 0.221379833206975,
"acc_stderr": 0.011436000004253518
},
"harness|winogrande|5": {
"acc": 0.8018942383583267,
"acc_stderr": 0.011201862744487047
}
}
```
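Equivalently, the raw JSON file linked in the heading above can be downloaded and parsed directly with `huggingface_hub`; the filename below is taken from that link, and inspecting the top-level keys first is a safe way to explore the structure before drilling into specific metrics:

```python
import json
from huggingface_hub import hf_hub_download

# Fetch the raw results file referenced in the "Latest results" link.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_jondurbin__airoboros-65b-gpt4-m2.0",
    filename="results_2023-10-22T21-36-42.557922.json",
    repo_type="dataset",
)
with open(path) as f:
    raw = json.load(f)
print(list(raw.keys()))  # inspect the structure before reading metrics
```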
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_jondurbin__airoboros-65b-gpt4-m2.0
|
[
"region:us"
] |
2023-08-18T10:23:39+00:00
|
{"pretty_name": "Evaluation run of jondurbin/airoboros-65b-gpt4-m2.0", "dataset_summary": "Dataset automatically created during the evaluation run of model [jondurbin/airoboros-65b-gpt4-m2.0](https://huggingface.co/jondurbin/airoboros-65b-gpt4-m2.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jondurbin__airoboros-65b-gpt4-m2.0\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-22T21:36:42.557922](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-65b-gpt4-m2.0/blob/main/results_2023-10-22T21-36-42.557922.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.07036493288590603,\n \"em_stderr\": 0.0026192324279004876,\n \"f1\": 0.14583787751677768,\n \"f1_stderr\": 0.002841532518554861,\n \"acc\": 0.5116370357826509,\n \"acc_stderr\": 0.011318931374370282\n },\n \"harness|drop|3\": {\n \"em\": 0.07036493288590603,\n \"em_stderr\": 0.0026192324279004876,\n \"f1\": 0.14583787751677768,\n \"f1_stderr\": 0.002841532518554861\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.221379833206975,\n \"acc_stderr\": 0.011436000004253518\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8018942383583267,\n \"acc_stderr\": 0.011201862744487047\n }\n}\n```", "repo_url": "https://huggingface.co/jondurbin/airoboros-65b-gpt4-m2.0", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_09T17_03_24.422206", "path": ["**/details_harness|arc:challenge|25_2023-08-09T17:03:24.422206.parquet"]}, {"split": "2023_08_09T18_28_50.823349", "path": ["**/details_harness|arc:challenge|25_2023-08-09T18:28:50.823349.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-09T18:28:50.823349.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_22T15_08_22.403545", "path": ["**/details_harness|drop|3_2023-10-22T15-08-22.403545.parquet"]}, {"split": "2023_10_22T21_36_42.557922", "path": ["**/details_harness|drop|3_2023-10-22T21-36-42.557922.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-22T21-36-42.557922.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_22T15_08_22.403545", "path": ["**/details_harness|gsm8k|5_2023-10-22T15-08-22.403545.parquet"]}, {"split": "2023_10_22T21_36_42.557922", "path": ["**/details_harness|gsm8k|5_2023-10-22T21-36-42.557922.parquet"]}, {"split": "latest", "path": 
["**/details_harness|gsm8k|5_2023-10-22T21-36-42.557922.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_09T17_03_24.422206", "path": ["**/details_harness|hellaswag|10_2023-08-09T17:03:24.422206.parquet"]}, {"split": "2023_08_09T18_28_50.823349", "path": ["**/details_harness|hellaswag|10_2023-08-09T18:28:50.823349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-09T18:28:50.823349.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_09T17_03_24.422206", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T17:03:24.422206.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T17:03:24.422206.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T17:03:24.422206.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T17:03:24.422206.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T17:03:24.422206.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T17:03:24.422206.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T17:03:24.422206.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T17:03:24.422206.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T17:03:24.422206.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T17:03:24.422206.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T17:03:24.422206.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T17:03:24.422206.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T17:03:24.422206.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T17:03:24.422206.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T17:03:24.422206.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T17:03:24.422206.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T17:03:24.422206.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T17:03:24.422206.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T17:03:24.422206.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T17:03:24.422206.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T17:03:24.422206.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T17:03:24.422206.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T17:03:24.422206.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T17:03:24.422206.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T17:03:24.422206.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T17:03:24.422206.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T17:03:24.422206.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T17:03:24.422206.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T17:03:24.422206.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T17:03:24.422206.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T17:03:24.422206.parquet", 
"**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T17:03:24.422206.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T17:03:24.422206.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T17:03:24.422206.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T17:03:24.422206.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T17:03:24.422206.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T17:03:24.422206.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T17:03:24.422206.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T17:03:24.422206.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T17:03:24.422206.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T17:03:24.422206.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T17:03:24.422206.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T17:03:24.422206.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T17:03:24.422206.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T17:03:24.422206.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T17:03:24.422206.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T17:03:24.422206.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T17:03:24.422206.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T17:03:24.422206.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T17:03:24.422206.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T17:03:24.422206.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T17:03:24.422206.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T17:03:24.422206.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T17:03:24.422206.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T17:03:24.422206.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T17:03:24.422206.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T17:03:24.422206.parquet"]}, {"split": "2023_08_09T18_28_50.823349", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T18:28:50.823349.parquet", 
"**/details_harness|hendrycksTest-econometrics|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T18:28:50.823349.parquet", 
"**/details_harness|hendrycksTest-public_relations|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T18:28:50.823349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T18:28:50.823349.parquet", 
"**/details_harness|hendrycksTest-human_aging|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T18:28:50.823349.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T18:28:50.823349.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_09T17_03_24.422206", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T17:03:24.422206.parquet"]}, {"split": "2023_08_09T18_28_50.823349", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T18:28:50.823349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T18:28:50.823349.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_09T17_03_24.422206", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T17:03:24.422206.parquet"]}, {"split": "2023_08_09T18_28_50.823349", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T18:28:50.823349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T18:28:50.823349.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_09T17_03_24.422206", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T17:03:24.422206.parquet"]}, {"split": "2023_08_09T18_28_50.823349", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T18:28:50.823349.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T18:28:50.823349.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_09T17_03_24.422206", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T17:03:24.422206.parquet"]}, {"split": "2023_08_09T18_28_50.823349", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T18:28:50.823349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T18:28:50.823349.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_09T17_03_24.422206", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T17:03:24.422206.parquet"]}, {"split": "2023_08_09T18_28_50.823349", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T18:28:50.823349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T18:28:50.823349.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_09T17_03_24.422206", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T17:03:24.422206.parquet"]}, {"split": "2023_08_09T18_28_50.823349", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T18:28:50.823349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T18:28:50.823349.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_09T17_03_24.422206", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T17:03:24.422206.parquet"]}, {"split": "2023_08_09T18_28_50.823349", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T18:28:50.823349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T18:28:50.823349.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_09T17_03_24.422206", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T17:03:24.422206.parquet"]}, {"split": "2023_08_09T18_28_50.823349", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T18:28:50.823349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T18:28:50.823349.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_09T17_03_24.422206", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T17:03:24.422206.parquet"]}, {"split": "2023_08_09T18_28_50.823349", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T18:28:50.823349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T18:28:50.823349.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_09T17_03_24.422206", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T17:03:24.422206.parquet"]}, {"split": "2023_08_09T18_28_50.823349", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T18:28:50.823349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T18:28:50.823349.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", 
"data_files": [{"split": "2023_08_09T17_03_24.422206", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T17:03:24.422206.parquet"]}, {"split": "2023_08_09T18_28_50.823349", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T18:28:50.823349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T18:28:50.823349.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_09T17_03_24.422206", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T17:03:24.422206.parquet"]}, {"split": "2023_08_09T18_28_50.823349", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T18:28:50.823349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T18:28:50.823349.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_09T17_03_24.422206", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T17:03:24.422206.parquet"]}, {"split": "2023_08_09T18_28_50.823349", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T18:28:50.823349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T18:28:50.823349.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_09T17_03_24.422206", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T17:03:24.422206.parquet"]}, {"split": "2023_08_09T18_28_50.823349", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T18:28:50.823349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T18:28:50.823349.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_09T17_03_24.422206", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T17:03:24.422206.parquet"]}, {"split": "2023_08_09T18_28_50.823349", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T18:28:50.823349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T18:28:50.823349.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_09T17_03_24.422206", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T17:03:24.422206.parquet"]}, {"split": "2023_08_09T18_28_50.823349", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T18:28:50.823349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T18:28:50.823349.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_09T17_03_24.422206", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T17:03:24.422206.parquet"]}, {"split": "2023_08_09T18_28_50.823349", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T18:28:50.823349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T18:28:50.823349.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_09T17_03_24.422206", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T17:03:24.422206.parquet"]}, {"split": 
"2023_08_09T18_28_50.823349", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T18:28:50.823349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T18:28:50.823349.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_09T17_03_24.422206", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T17:03:24.422206.parquet"]}, {"split": "2023_08_09T18_28_50.823349", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T18:28:50.823349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T18:28:50.823349.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_09T17_03_24.422206", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T17:03:24.422206.parquet"]}, {"split": "2023_08_09T18_28_50.823349", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T18:28:50.823349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T18:28:50.823349.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_09T17_03_24.422206", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T17:03:24.422206.parquet"]}, {"split": "2023_08_09T18_28_50.823349", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T18:28:50.823349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T18:28:50.823349.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_09T17_03_24.422206", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T17:03:24.422206.parquet"]}, {"split": "2023_08_09T18_28_50.823349", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T18:28:50.823349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T18:28:50.823349.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_09T17_03_24.422206", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T17:03:24.422206.parquet"]}, {"split": "2023_08_09T18_28_50.823349", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T18:28:50.823349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T18:28:50.823349.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_09T17_03_24.422206", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T17:03:24.422206.parquet"]}, {"split": "2023_08_09T18_28_50.823349", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T18:28:50.823349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T18:28:50.823349.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_09T17_03_24.422206", "path": 
["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T17:03:24.422206.parquet"]}, {"split": "2023_08_09T18_28_50.823349", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T18:28:50.823349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T18:28:50.823349.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_09T17_03_24.422206", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T17:03:24.422206.parquet"]}, {"split": "2023_08_09T18_28_50.823349", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T18:28:50.823349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T18:28:50.823349.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_09T17_03_24.422206", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T17:03:24.422206.parquet"]}, {"split": "2023_08_09T18_28_50.823349", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T18:28:50.823349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T18:28:50.823349.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_09T17_03_24.422206", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T17:03:24.422206.parquet"]}, {"split": "2023_08_09T18_28_50.823349", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T18:28:50.823349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T18:28:50.823349.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_09T17_03_24.422206", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T17:03:24.422206.parquet"]}, {"split": "2023_08_09T18_28_50.823349", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T18:28:50.823349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T18:28:50.823349.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_09T17_03_24.422206", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T17:03:24.422206.parquet"]}, {"split": "2023_08_09T18_28_50.823349", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T18:28:50.823349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T18:28:50.823349.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_09T17_03_24.422206", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T17:03:24.422206.parquet"]}, {"split": "2023_08_09T18_28_50.823349", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T18:28:50.823349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T18:28:50.823349.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_09T17_03_24.422206", "path": 
["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T17:03:24.422206.parquet"]}, {"split": "2023_08_09T18_28_50.823349", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T18:28:50.823349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T18:28:50.823349.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_09T17_03_24.422206", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T17:03:24.422206.parquet"]}, {"split": "2023_08_09T18_28_50.823349", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T18:28:50.823349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T18:28:50.823349.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_09T17_03_24.422206", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T17:03:24.422206.parquet"]}, {"split": "2023_08_09T18_28_50.823349", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T18:28:50.823349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T18:28:50.823349.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_09T17_03_24.422206", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T17:03:24.422206.parquet"]}, {"split": "2023_08_09T18_28_50.823349", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T18:28:50.823349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T18:28:50.823349.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_09T17_03_24.422206", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T17:03:24.422206.parquet"]}, {"split": "2023_08_09T18_28_50.823349", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T18:28:50.823349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T18:28:50.823349.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_09T17_03_24.422206", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T17:03:24.422206.parquet"]}, {"split": "2023_08_09T18_28_50.823349", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T18:28:50.823349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T18:28:50.823349.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_09T17_03_24.422206", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T17:03:24.422206.parquet"]}, {"split": "2023_08_09T18_28_50.823349", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T18:28:50.823349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T18:28:50.823349.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_09T17_03_24.422206", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T17:03:24.422206.parquet"]}, {"split": "2023_08_09T18_28_50.823349", "path": 
["**/details_harness|hendrycksTest-management|5_2023-08-09T18:28:50.823349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T18:28:50.823349.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_09T17_03_24.422206", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T17:03:24.422206.parquet"]}, {"split": "2023_08_09T18_28_50.823349", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T18:28:50.823349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T18:28:50.823349.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_09T17_03_24.422206", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T17:03:24.422206.parquet"]}, {"split": "2023_08_09T18_28_50.823349", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T18:28:50.823349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T18:28:50.823349.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_09T17_03_24.422206", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T17:03:24.422206.parquet"]}, {"split": "2023_08_09T18_28_50.823349", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T18:28:50.823349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T18:28:50.823349.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_09T17_03_24.422206", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T17:03:24.422206.parquet"]}, {"split": "2023_08_09T18_28_50.823349", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T18:28:50.823349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T18:28:50.823349.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_09T17_03_24.422206", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T17:03:24.422206.parquet"]}, {"split": "2023_08_09T18_28_50.823349", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T18:28:50.823349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T18:28:50.823349.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_09T17_03_24.422206", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T17:03:24.422206.parquet"]}, {"split": "2023_08_09T18_28_50.823349", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T18:28:50.823349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T18:28:50.823349.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_09T17_03_24.422206", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T17:03:24.422206.parquet"]}, {"split": "2023_08_09T18_28_50.823349", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T18:28:50.823349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T18:28:50.823349.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": 
"2023_08_09T17_03_24.422206", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T17:03:24.422206.parquet"]}, {"split": "2023_08_09T18_28_50.823349", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T18:28:50.823349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T18:28:50.823349.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_09T17_03_24.422206", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T17:03:24.422206.parquet"]}, {"split": "2023_08_09T18_28_50.823349", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T18:28:50.823349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T18:28:50.823349.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_09T17_03_24.422206", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T17:03:24.422206.parquet"]}, {"split": "2023_08_09T18_28_50.823349", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T18:28:50.823349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T18:28:50.823349.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_09T17_03_24.422206", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T17:03:24.422206.parquet"]}, {"split": "2023_08_09T18_28_50.823349", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T18:28:50.823349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T18:28:50.823349.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_09T17_03_24.422206", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T17:03:24.422206.parquet"]}, {"split": "2023_08_09T18_28_50.823349", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T18:28:50.823349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T18:28:50.823349.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_09T17_03_24.422206", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T17:03:24.422206.parquet"]}, {"split": "2023_08_09T18_28_50.823349", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T18:28:50.823349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T18:28:50.823349.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_09T17_03_24.422206", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T17:03:24.422206.parquet"]}, {"split": "2023_08_09T18_28_50.823349", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T18:28:50.823349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T18:28:50.823349.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_09T17_03_24.422206", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T17:03:24.422206.parquet"]}, {"split": 
"2023_08_09T18_28_50.823349", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T18:28:50.823349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T18:28:50.823349.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_09T17_03_24.422206", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T17:03:24.422206.parquet"]}, {"split": "2023_08_09T18_28_50.823349", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T18:28:50.823349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T18:28:50.823349.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_09T17_03_24.422206", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T17:03:24.422206.parquet"]}, {"split": "2023_08_09T18_28_50.823349", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T18:28:50.823349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T18:28:50.823349.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_09T17_03_24.422206", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T17:03:24.422206.parquet"]}, {"split": "2023_08_09T18_28_50.823349", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T18:28:50.823349.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T18:28:50.823349.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_09T17_03_24.422206", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T17:03:24.422206.parquet"]}, {"split": "2023_08_09T18_28_50.823349", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T18:28:50.823349.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T18:28:50.823349.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_22T15_08_22.403545", "path": ["**/details_harness|winogrande|5_2023-10-22T15-08-22.403545.parquet"]}, {"split": "2023_10_22T21_36_42.557922", "path": ["**/details_harness|winogrande|5_2023-10-22T21-36-42.557922.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-22T21-36-42.557922.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_09T17_03_24.422206", "path": ["results_2023-08-09T17:03:24.422206.parquet"]}, {"split": "2023_08_09T18_28_50.823349", "path": ["results_2023-08-09T18:28:50.823349.parquet"]}, {"split": "2023_10_22T15_08_22.403545", "path": ["results_2023-10-22T15-08-22.403545.parquet"]}, {"split": "2023_10_22T21_36_42.557922", "path": ["results_2023-10-22T21-36-42.557922.parquet"]}, {"split": "latest", "path": ["results_2023-10-22T21-36-42.557922.parquet"]}]}]}
# Dataset Card for Evaluation run of jondurbin/airoboros-65b-gpt4-m2.0
## Dataset Description
- **Homepage:**
- **Repository:** URL
- **Paper:**
- **Leaderboard:** URL
- **Point of Contact:** clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model jondurbin/airoboros-65b-gpt4-m2.0 on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
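A minimal sketch, assuming the leaderboard's standard `details_<org>__<model>` naming convention for the details repository (the original link is elided above):

```python
from datasets import load_dataset

# Repository id inferred from the model name per the leaderboard's
# usual naming convention (an assumption, since the original URL
# was elided in this card).
data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-65b-gpt4-m2.0",
	"harness_winogrande_5",
	split="train")
```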
## Latest results
These are the latest results from run 2023-10-22T21:36:42.557922 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
# Dataset Card for Evaluation run of jondurbin/airoboros-7b-gpt4-1.1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/jondurbin/airoboros-7b-gpt4-1.1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [jondurbin/airoboros-7b-gpt4-1.1](https://huggingface.co/jondurbin/airoboros-7b-gpt4-1.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-7b-gpt4-1.1",
"harness_winogrande_5",
split="train")
```
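
The aggregated metrics live in the separate "results" configuration mentioned above; a minimal sketch, assuming this repository follows the same "results"/"latest" layout as the other leaderboard details repositories:

```python
from datasets import load_dataset

# Load the aggregated results; the "results" configuration and its
# "latest" split follow the layout used by other leaderboard details
# repositories (an assumption for this particular repo).
results = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-7b-gpt4-1.1",
	"results",
	split="latest")
```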
## Latest results
These are the [latest results from run 2023-10-22T13:09:52.806111](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-7b-gpt4-1.1/blob/main/results_2023-10-22T13-09-52.806111.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.19798657718120805,
"em_stderr": 0.00408082849939278,
"f1": 0.2537437080536912,
"f1_stderr": 0.004098830726202191,
"acc": 0.38097222729184826,
"acc_stderr": 0.008622604334831044
},
"harness|drop|3": {
"em": 0.19798657718120805,
"em_stderr": 0.00408082849939278,
"f1": 0.2537437080536912,
"f1_stderr": 0.004098830726202191
},
"harness|gsm8k|5": {
"acc": 0.0310841546626232,
"acc_stderr": 0.004780296718393349
},
"harness|winogrande|5": {
"acc": 0.7308602999210734,
"acc_stderr": 0.012464911951268738
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
{"pretty_name": "Evaluation run of jondurbin/airoboros-7b-gpt4-1.1", "dataset_summary": "Dataset automatically created during the evaluation run of model [jondurbin/airoboros-7b-gpt4-1.1](https://huggingface.co/jondurbin/airoboros-7b-gpt4-1.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jondurbin__airoboros-7b-gpt4-1.1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-22T13:09:52.806111](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-7b-gpt4-1.1/blob/main/results_2023-10-22T13-09-52.806111.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.19798657718120805,\n \"em_stderr\": 0.00408082849939278,\n \"f1\": 0.2537437080536912,\n \"f1_stderr\": 0.004098830726202191,\n \"acc\": 0.38097222729184826,\n \"acc_stderr\": 0.008622604334831044\n },\n \"harness|drop|3\": {\n \"em\": 0.19798657718120805,\n \"em_stderr\": 0.00408082849939278,\n \"f1\": 0.2537437080536912,\n \"f1_stderr\": 0.004098830726202191\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0310841546626232,\n \"acc_stderr\": 0.004780296718393349\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7308602999210734,\n \"acc_stderr\": 0.012464911951268738\n }\n}\n```", "repo_url": "https://huggingface.co/jondurbin/airoboros-7b-gpt4-1.1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_31T13_46_19.144094", "path": ["**/details_harness|arc:challenge|25_2023-07-31T13:46:19.144094.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-31T13:46:19.144094.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_22T13_09_52.806111", "path": ["**/details_harness|drop|3_2023-10-22T13-09-52.806111.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-22T13-09-52.806111.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_22T13_09_52.806111", "path": ["**/details_harness|gsm8k|5_2023-10-22T13-09-52.806111.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-22T13-09-52.806111.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_31T13_46_19.144094", "path": ["**/details_harness|hellaswag|10_2023-07-31T13:46:19.144094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-31T13:46:19.144094.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_5", "data_files": [{"split": "2023_07_31T13_46_19.144094", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T13:46:19.144094.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-31T13:46:19.144094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T13:46:19.144094.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-31T13:46:19.144094.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-31T13:46:19.144094.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_31T13_46_19.144094", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T13:46:19.144094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T13:46:19.144094.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_31T13_46_19.144094", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-31T13:46:19.144094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-31T13:46:19.144094.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_31T13_46_19.144094", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-31T13:46:19.144094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-31T13:46:19.144094.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_31T13_46_19.144094", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T13:46:19.144094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T13:46:19.144094.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_31T13_46_19.144094", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T13:46:19.144094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T13:46:19.144094.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_31T13_46_19.144094", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-31T13:46:19.144094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-31T13:46:19.144094.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_31T13_46_19.144094", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T13:46:19.144094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T13:46:19.144094.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_31T13_46_19.144094", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T13:46:19.144094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T13:46:19.144094.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_31T13_46_19.144094", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T13:46:19.144094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T13:46:19.144094.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_31T13_46_19.144094", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T13:46:19.144094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T13:46:19.144094.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_31T13_46_19.144094", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-31T13:46:19.144094.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-31T13:46:19.144094.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_31T13_46_19.144094", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-31T13:46:19.144094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-31T13:46:19.144094.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_31T13_46_19.144094", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T13:46:19.144094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T13:46:19.144094.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_31T13_46_19.144094", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-31T13:46:19.144094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-31T13:46:19.144094.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_31T13_46_19.144094", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T13:46:19.144094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T13:46:19.144094.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_31T13_46_19.144094", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T13:46:19.144094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T13:46:19.144094.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_31T13_46_19.144094", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T13:46:19.144094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T13:46:19.144094.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_31T13_46_19.144094", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-31T13:46:19.144094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-31T13:46:19.144094.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_31T13_46_19.144094", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T13:46:19.144094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T13:46:19.144094.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_31T13_46_19.144094", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T13:46:19.144094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T13:46:19.144094.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_31T13_46_19.144094", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T13:46:19.144094.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T13:46:19.144094.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_31T13_46_19.144094", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T13:46:19.144094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T13:46:19.144094.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_31T13_46_19.144094", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T13:46:19.144094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T13:46:19.144094.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_31T13_46_19.144094", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T13:46:19.144094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T13:46:19.144094.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_31T13_46_19.144094", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T13:46:19.144094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T13:46:19.144094.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_31T13_46_19.144094", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T13:46:19.144094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T13:46:19.144094.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_31T13_46_19.144094", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T13:46:19.144094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T13:46:19.144094.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_31T13_46_19.144094", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T13:46:19.144094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T13:46:19.144094.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_31T13_46_19.144094", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T13:46:19.144094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T13:46:19.144094.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_31T13_46_19.144094", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T13:46:19.144094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T13:46:19.144094.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_31T13_46_19.144094", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T13:46:19.144094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T13:46:19.144094.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_31T13_46_19.144094", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T13:46:19.144094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T13:46:19.144094.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_31T13_46_19.144094", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-31T13:46:19.144094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-31T13:46:19.144094.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_31T13_46_19.144094", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T13:46:19.144094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T13:46:19.144094.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_31T13_46_19.144094", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-31T13:46:19.144094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-31T13:46:19.144094.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_31T13_46_19.144094", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T13:46:19.144094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T13:46:19.144094.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_31T13_46_19.144094", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T13:46:19.144094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T13:46:19.144094.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_31T13_46_19.144094", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T13:46:19.144094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T13:46:19.144094.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_31T13_46_19.144094", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-31T13:46:19.144094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-31T13:46:19.144094.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_31T13_46_19.144094", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-31T13:46:19.144094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-31T13:46:19.144094.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_31T13_46_19.144094", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T13:46:19.144094.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T13:46:19.144094.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_31T13_46_19.144094", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T13:46:19.144094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T13:46:19.144094.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_31T13_46_19.144094", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T13:46:19.144094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T13:46:19.144094.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_31T13_46_19.144094", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T13:46:19.144094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T13:46:19.144094.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_31T13_46_19.144094", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-31T13:46:19.144094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-31T13:46:19.144094.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_31T13_46_19.144094", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-31T13:46:19.144094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-31T13:46:19.144094.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_31T13_46_19.144094", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-31T13:46:19.144094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-31T13:46:19.144094.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_31T13_46_19.144094", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T13:46:19.144094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T13:46:19.144094.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_31T13_46_19.144094", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-31T13:46:19.144094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-31T13:46:19.144094.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_31T13_46_19.144094", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T13:46:19.144094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T13:46:19.144094.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_31T13_46_19.144094", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T13:46:19.144094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T13:46:19.144094.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_31T13_46_19.144094", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-31T13:46:19.144094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-31T13:46:19.144094.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_31T13_46_19.144094", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-31T13:46:19.144094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-31T13:46:19.144094.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_31T13_46_19.144094", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-31T13:46:19.144094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-31T13:46:19.144094.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_31T13_46_19.144094", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T13:46:19.144094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T13:46:19.144094.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_31T13_46_19.144094", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-31T13:46:19.144094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-31T13:46:19.144094.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_31T13_46_19.144094", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-31T13:46:19.144094.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-31T13:46:19.144094.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_31T13_46_19.144094", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-31T13:46:19.144094.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-31T13:46:19.144094.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_22T13_09_52.806111", "path": ["**/details_harness|winogrande|5_2023-10-22T13-09-52.806111.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-22T13-09-52.806111.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_31T13_46_19.144094", "path": ["results_2023-07-31T13:46:19.144094.parquet"]}, {"split": "2023_10_22T13_09_52.806111", "path": ["results_2023-10-22T13-09-52.806111.parquet"]}, {"split": "latest", "path": ["results_2023-10-22T13-09-52.806111.parquet"]}]}]}
|
2023-10-22T12:10:06+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of jondurbin/airoboros-7b-gpt4-1.1
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model jondurbin/airoboros-7b-gpt4-1.1 on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
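A minimal sketch of that call, assuming the details repository for this model follows the leaderboard's usual `open-llm-leaderboard/details_<org>__<model>` naming convention:

```python
from datasets import load_dataset

# Repo id inferred from the leaderboard's naming convention (assumption)
data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-7b-gpt4-1.1",
                    "harness_winogrande_5",
                    split="train")
```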
## Latest results
These are the latest results from run 2023-10-22T13:09:52.806111 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
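The aggregated metrics themselves live in the "results" configuration; a hedged sketch of reloading them, under the same repo-id assumption:

```python
from datasets import load_dataset

# "latest" always points at the newest evaluation run
results = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-7b-gpt4-1.1",
                       "results",
                       split="latest")
print(results[0])  # the row holding the run's aggregated metrics
```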
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of jondurbin/airoboros-7b-gpt4-1.1",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model jondurbin/airoboros-7b-gpt4-1.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-22T13:09:52.806111(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of jondurbin/airoboros-7b-gpt4-1.1",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model jondurbin/airoboros-7b-gpt4-1.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-22T13:09:52.806111(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
23,
31,
171,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of jondurbin/airoboros-7b-gpt4-1.1## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model jondurbin/airoboros-7b-gpt4-1.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-22T13:09:52.806111(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
3bea9332cc9c0196adace1e7cd4edee0b1d74861
|
# Dataset Card for Evaluation run of WangZeJun/bloom-820m-chat
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/WangZeJun/bloom-820m-chat
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [WangZeJun/bloom-820m-chat](https://huggingface.co/WangZeJun/bloom-820m-chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_WangZeJun__bloom-820m-chat",
"harness_winogrande_5",
split="train")
```
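The same call works for any other configuration listed in this repository's metadata, e.g. `harness_drop_3` or `harness_gsm8k_5`: swap the config name and keep the split.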
## Latest results
These are the [latest results from run 2023-09-17T22:00:08.030398](https://huggingface.co/datasets/open-llm-leaderboard/details_WangZeJun__bloom-820m-chat/blob/main/results_2023-09-17T22-00-08.030398.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0382760067114094,
"em_stderr": 0.0019648445106113157,
"f1": 0.08853187919463057,
"f1_stderr": 0.0023716202448817885,
"acc": 0.265982636148382,
"acc_stderr": 0.007011869610583192
},
"harness|drop|3": {
"em": 0.0382760067114094,
"em_stderr": 0.0019648445106113157,
"f1": 0.08853187919463057,
"f1_stderr": 0.0023716202448817885
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.531965272296764,
"acc_stderr": 0.014023739221166384
}
}
```
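Individual runs are also addressable by their timestamp split, as the summary above notes; a minimal sketch, using the drop run recorded in this repository's configs metadata:

```python
from datasets import load_dataset

# Timestamp split name taken verbatim from the "harness_drop_3" config
drop_run = load_dataset("open-llm-leaderboard/details_WangZeJun__bloom-820m-chat",
                        "harness_drop_3",
                        split="2023_09_17T22_00_08.030398")
```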
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_WangZeJun__bloom-820m-chat
|
[
"region:us"
] |
2023-08-18T10:24:05+00:00
|
{"pretty_name": "Evaluation run of WangZeJun/bloom-820m-chat", "dataset_summary": "Dataset automatically created during the evaluation run of model [WangZeJun/bloom-820m-chat](https://huggingface.co/WangZeJun/bloom-820m-chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_WangZeJun__bloom-820m-chat\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-17T22:00:08.030398](https://huggingface.co/datasets/open-llm-leaderboard/details_WangZeJun__bloom-820m-chat/blob/main/results_2023-09-17T22-00-08.030398.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0382760067114094,\n \"em_stderr\": 0.0019648445106113157,\n \"f1\": 0.08853187919463057,\n \"f1_stderr\": 0.0023716202448817885,\n \"acc\": 0.265982636148382,\n \"acc_stderr\": 0.007011869610583192\n },\n \"harness|drop|3\": {\n \"em\": 0.0382760067114094,\n \"em_stderr\": 0.0019648445106113157,\n \"f1\": 0.08853187919463057,\n \"f1_stderr\": 0.0023716202448817885\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.531965272296764,\n \"acc_stderr\": 0.014023739221166384\n }\n}\n```", "repo_url": "https://huggingface.co/WangZeJun/bloom-820m-chat", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_17T10_54_24.303970", "path": ["**/details_harness|arc:challenge|25_2023-08-17T10:54:24.303970.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-17T10:54:24.303970.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_17T22_00_08.030398", "path": ["**/details_harness|drop|3_2023-09-17T22-00-08.030398.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-17T22-00-08.030398.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_17T22_00_08.030398", "path": ["**/details_harness|gsm8k|5_2023-09-17T22-00-08.030398.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-17T22-00-08.030398.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_17T10_54_24.303970", "path": ["**/details_harness|hellaswag|10_2023-08-17T10:54:24.303970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-17T10:54:24.303970.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_17T10_54_24.303970", 
"path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T10:54:24.303970.parquet", 
"**/details_harness|hendrycksTest-management|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-17T10:54:24.303970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T10:54:24.303970.parquet", 
"**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-17T10:54:24.303970.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-17T10:54:24.303970.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_17T10_54_24.303970", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T10:54:24.303970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-17T10:54:24.303970.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_17T10_54_24.303970", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-17T10:54:24.303970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-17T10:54:24.303970.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_17T10_54_24.303970", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-17T10:54:24.303970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-17T10:54:24.303970.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_17T10_54_24.303970", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T10:54:24.303970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-17T10:54:24.303970.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_17T10_54_24.303970", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T10:54:24.303970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-17T10:54:24.303970.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_17T10_54_24.303970", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-17T10:54:24.303970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-17T10:54:24.303970.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_17T10_54_24.303970", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T10:54:24.303970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-17T10:54:24.303970.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_17T10_54_24.303970", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T10:54:24.303970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-17T10:54:24.303970.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_17T10_54_24.303970", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T10:54:24.303970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-17T10:54:24.303970.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_17T10_54_24.303970", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T10:54:24.303970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-17T10:54:24.303970.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_17T10_54_24.303970", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-17T10:54:24.303970.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-08-17T10:54:24.303970.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_17T10_54_24.303970", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-17T10:54:24.303970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-17T10:54:24.303970.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_17T10_54_24.303970", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T10:54:24.303970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-17T10:54:24.303970.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_17T10_54_24.303970", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-17T10:54:24.303970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-17T10:54:24.303970.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_17T10_54_24.303970", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T10:54:24.303970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-17T10:54:24.303970.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_17T10_54_24.303970", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T10:54:24.303970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-17T10:54:24.303970.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_17T10_54_24.303970", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T10:54:24.303970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-17T10:54:24.303970.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_17T10_54_24.303970", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-17T10:54:24.303970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-17T10:54:24.303970.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_17T10_54_24.303970", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T10:54:24.303970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-17T10:54:24.303970.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_17T10_54_24.303970", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T10:54:24.303970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-17T10:54:24.303970.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_17T10_54_24.303970", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T10:54:24.303970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-17T10:54:24.303970.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_17T10_54_24.303970", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T10:54:24.303970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-17T10:54:24.303970.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_17T10_54_24.303970", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T10:54:24.303970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-17T10:54:24.303970.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_17T10_54_24.303970", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T10:54:24.303970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-17T10:54:24.303970.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_17T10_54_24.303970", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T10:54:24.303970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-17T10:54:24.303970.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_17T10_54_24.303970", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T10:54:24.303970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-17T10:54:24.303970.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_17T10_54_24.303970", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T10:54:24.303970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-17T10:54:24.303970.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_17T10_54_24.303970", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T10:54:24.303970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-17T10:54:24.303970.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_17T10_54_24.303970", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T10:54:24.303970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-17T10:54:24.303970.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_17T10_54_24.303970", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T10:54:24.303970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-17T10:54:24.303970.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_17T10_54_24.303970", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T10:54:24.303970.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-17T10:54:24.303970.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_17T10_54_24.303970", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T10:54:24.303970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-17T10:54:24.303970.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_17T10_54_24.303970", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-17T10:54:24.303970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-17T10:54:24.303970.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_17T10_54_24.303970", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T10:54:24.303970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-17T10:54:24.303970.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_17T10_54_24.303970", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-17T10:54:24.303970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-17T10:54:24.303970.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_17T10_54_24.303970", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T10:54:24.303970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-17T10:54:24.303970.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_17T10_54_24.303970", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T10:54:24.303970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-17T10:54:24.303970.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_17T10_54_24.303970", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T10:54:24.303970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-17T10:54:24.303970.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_17T10_54_24.303970", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-17T10:54:24.303970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-17T10:54:24.303970.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_17T10_54_24.303970", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-17T10:54:24.303970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-17T10:54:24.303970.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_17T10_54_24.303970", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T10:54:24.303970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-17T10:54:24.303970.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": 
"2023_08_17T10_54_24.303970", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T10:54:24.303970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-17T10:54:24.303970.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_17T10_54_24.303970", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T10:54:24.303970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-17T10:54:24.303970.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_17T10_54_24.303970", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T10:54:24.303970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-17T10:54:24.303970.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_17T10_54_24.303970", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-17T10:54:24.303970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-17T10:54:24.303970.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_17T10_54_24.303970", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-17T10:54:24.303970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-17T10:54:24.303970.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_17T10_54_24.303970", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-17T10:54:24.303970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-17T10:54:24.303970.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_17T10_54_24.303970", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T10:54:24.303970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-17T10:54:24.303970.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_17T10_54_24.303970", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-17T10:54:24.303970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-17T10:54:24.303970.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_17T10_54_24.303970", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T10:54:24.303970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-17T10:54:24.303970.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_17T10_54_24.303970", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T10:54:24.303970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-17T10:54:24.303970.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_17T10_54_24.303970", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-17T10:54:24.303970.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2023-08-17T10:54:24.303970.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_17T10_54_24.303970", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-17T10:54:24.303970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-17T10:54:24.303970.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_17T10_54_24.303970", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-17T10:54:24.303970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-17T10:54:24.303970.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_17T10_54_24.303970", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T10:54:24.303970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-17T10:54:24.303970.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_17T10_54_24.303970", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-17T10:54:24.303970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-17T10:54:24.303970.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_17T10_54_24.303970", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-17T10:54:24.303970.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-17T10:54:24.303970.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_17T10_54_24.303970", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-17T10:54:24.303970.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-17T10:54:24.303970.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_17T22_00_08.030398", "path": ["**/details_harness|winogrande|5_2023-09-17T22-00-08.030398.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-17T22-00-08.030398.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_17T10_54_24.303970", "path": ["results_2023-08-17T10:54:24.303970.parquet"]}, {"split": "2023_09_17T22_00_08.030398", "path": ["results_2023-09-17T22-00-08.030398.parquet"]}, {"split": "latest", "path": ["results_2023-09-17T22-00-08.030398.parquet"]}]}]}
|
2023-09-17T21:00:19+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of WangZeJun/bloom-820m-chat
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model WangZeJun/bloom-820m-chat on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
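This condensed card omits the snippet itself. Following the `open-llm-leaderboard/details_{org}__{model}` repository naming convention used by the full cards in this document (the exact repository name for this model is an assumption), a minimal sketch would be:

```python
from datasets import load_dataset

# Repository name assumed from the naming convention used elsewhere in this document.
data = load_dataset("open-llm-leaderboard/details_WangZeJun__bloom-820m-chat",
                    "harness_winogrande_5",
                    split="train")
```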
## Latest results
These are the latest results from run 2023-09-17T22:00:08.030398 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of WangZeJun/bloom-820m-chat",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model WangZeJun/bloom-820m-chat on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-17T22:00:08.030398(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of WangZeJun/bloom-820m-chat",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model WangZeJun/bloom-820m-chat on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-17T22:00:08.030398(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
20,
31,
168,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of WangZeJun/bloom-820m-chat## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model WangZeJun/bloom-820m-chat on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-17T22:00:08.030398(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
599aba0c83143fb55b39e2dc366237814715d656
|
# Dataset Card for Evaluation run of TheBloke/airoboros-7b-gpt4-fp16
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/airoboros-7b-gpt4-fp16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/airoboros-7b-gpt4-fp16](https://huggingface.co/TheBloke/airoboros-7b-gpt4-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__airoboros-7b-gpt4-fp16",
"harness_winogrande_5",
split="train")
```
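Since each evaluated task has its own configuration, the available configuration names can also be enumerated before loading; a minimal sketch using the `datasets` library:

```python
from datasets import get_dataset_config_names

# Lists the per-task configurations plus the aggregated "results" configuration.
configs = get_dataset_config_names("open-llm-leaderboard/details_TheBloke__airoboros-7b-gpt4-fp16")
print(len(configs), configs[:5])
```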
## Latest results
These are the [latest results from run 2023-10-22T11:48:44.859139](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__airoboros-7b-gpt4-fp16/blob/main/results_2023-10-22T11-48-44.859139.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.24276426174496643,
"em_stderr": 0.004390839668047224,
"f1": 0.3038569630872493,
"f1_stderr": 0.004387376487144696,
"acc": 0.37414887626834564,
"acc_stderr": 0.008035199409633497
},
"harness|drop|3": {
"em": 0.24276426174496643,
"em_stderr": 0.004390839668047224,
"f1": 0.3038569630872493,
"f1_stderr": 0.004387376487144696
},
"harness|gsm8k|5": {
"acc": 0.017437452615617893,
"acc_stderr": 0.0036054868679982572
},
"harness|winogrande|5": {
"acc": 0.7308602999210734,
"acc_stderr": 0.012464911951268738
}
}
```
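To inspect these aggregated numbers programmatically, the "results" configuration can be loaded directly; per this card's metadata, its "latest" split points at the most recent run (a minimal sketch; the exact column layout of the results parquet is not documented here):

```python
from datasets import load_dataset

# Inspect the aggregated metrics; the exact schema is not documented in this card.
results = load_dataset("open-llm-leaderboard/details_TheBloke__airoboros-7b-gpt4-fp16",
                       "results",
                       split="latest")
print(results[0])
```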
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_TheBloke__airoboros-7b-gpt4-fp16
|
[
"region:us"
] |
2023-08-18T10:24:14+00:00
|
{"pretty_name": "Evaluation run of TheBloke/airoboros-7b-gpt4-fp16", "dataset_summary": "Dataset automatically created during the evaluation run of model [TheBloke/airoboros-7b-gpt4-fp16](https://huggingface.co/TheBloke/airoboros-7b-gpt4-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__airoboros-7b-gpt4-fp16\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-22T11:48:44.859139](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__airoboros-7b-gpt4-fp16/blob/main/results_2023-10-22T11-48-44.859139.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.24276426174496643,\n \"em_stderr\": 0.004390839668047224,\n \"f1\": 0.3038569630872493,\n \"f1_stderr\": 0.004387376487144696,\n \"acc\": 0.37414887626834564,\n \"acc_stderr\": 0.008035199409633497\n },\n \"harness|drop|3\": {\n \"em\": 0.24276426174496643,\n \"em_stderr\": 0.004390839668047224,\n \"f1\": 0.3038569630872493,\n \"f1_stderr\": 0.004387376487144696\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.017437452615617893,\n \"acc_stderr\": 0.0036054868679982572\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7308602999210734,\n \"acc_stderr\": 0.012464911951268738\n }\n}\n```", "repo_url": "https://huggingface.co/TheBloke/airoboros-7b-gpt4-fp16", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T17_47_19.580481", "path": ["**/details_harness|arc:challenge|25_2023-07-19T17:47:19.580481.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T17:47:19.580481.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_22T11_48_44.859139", "path": ["**/details_harness|drop|3_2023-10-22T11-48-44.859139.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-22T11-48-44.859139.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_22T11_48_44.859139", "path": ["**/details_harness|gsm8k|5_2023-10-22T11-48-44.859139.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-22T11-48-44.859139.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T17_47_19.580481", "path": ["**/details_harness|hellaswag|10_2023-07-19T17:47:19.580481.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T17:47:19.580481.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T17_47_19.580481", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:47:19.580481.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:47:19.580481.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:47:19.580481.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T17:47:19.580481.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:47:19.580481.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T17_47_19.580481", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:47:19.580481.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:47:19.580481.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T17_47_19.580481", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:47:19.580481.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:47:19.580481.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T17_47_19.580481", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:47:19.580481.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:47:19.580481.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T17_47_19.580481", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:47:19.580481.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:47:19.580481.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T17_47_19.580481", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:47:19.580481.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:47:19.580481.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T17_47_19.580481", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:47:19.580481.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:47:19.580481.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T17_47_19.580481", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:47:19.580481.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:47:19.580481.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T17_47_19.580481", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:47:19.580481.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:47:19.580481.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T17_47_19.580481", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:47:19.580481.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:47:19.580481.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T17_47_19.580481", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:47:19.580481.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:47:19.580481.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T17_47_19.580481", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:47:19.580481.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:47:19.580481.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T17_47_19.580481", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:47:19.580481.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:47:19.580481.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T17_47_19.580481", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:47:19.580481.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:47:19.580481.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T17_47_19.580481", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:47:19.580481.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:47:19.580481.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T17_47_19.580481", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:47:19.580481.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:47:19.580481.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T17_47_19.580481", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:47:19.580481.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:47:19.580481.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T17_47_19.580481", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:47:19.580481.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:47:19.580481.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T17_47_19.580481", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:47:19.580481.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:47:19.580481.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T17_47_19.580481", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:47:19.580481.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:47:19.580481.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T17_47_19.580481", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:47:19.580481.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:47:19.580481.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T17_47_19.580481", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:47:19.580481.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:47:19.580481.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T17_47_19.580481", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:47:19.580481.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:47:19.580481.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T17_47_19.580481", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:47:19.580481.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:47:19.580481.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T17_47_19.580481", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:47:19.580481.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:47:19.580481.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T17_47_19.580481", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:47:19.580481.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:47:19.580481.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T17_47_19.580481", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:47:19.580481.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:47:19.580481.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T17_47_19.580481", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:47:19.580481.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:47:19.580481.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T17_47_19.580481", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:47:19.580481.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:47:19.580481.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T17_47_19.580481", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:47:19.580481.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:47:19.580481.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T17_47_19.580481", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:47:19.580481.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:47:19.580481.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T17_47_19.580481", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:47:19.580481.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:47:19.580481.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T17_47_19.580481", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:47:19.580481.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:47:19.580481.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T17_47_19.580481", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:47:19.580481.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:47:19.580481.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T17_47_19.580481", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:47:19.580481.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:47:19.580481.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T17_47_19.580481", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:47:19.580481.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:47:19.580481.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T17_47_19.580481", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:47:19.580481.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:47:19.580481.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T17_47_19.580481", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:47:19.580481.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:47:19.580481.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T17_47_19.580481", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:47:19.580481.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:47:19.580481.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T17_47_19.580481", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T17:47:19.580481.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T17:47:19.580481.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T17_47_19.580481", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:47:19.580481.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:47:19.580481.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T17_47_19.580481", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:47:19.580481.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:47:19.580481.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T17_47_19.580481", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:47:19.580481.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:47:19.580481.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T17_47_19.580481", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:47:19.580481.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:47:19.580481.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T17_47_19.580481", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:47:19.580481.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:47:19.580481.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T17_47_19.580481", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:47:19.580481.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:47:19.580481.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T17_47_19.580481", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:47:19.580481.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:47:19.580481.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T17_47_19.580481", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:47:19.580481.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:47:19.580481.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T17_47_19.580481", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:47:19.580481.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:47:19.580481.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T17_47_19.580481", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:47:19.580481.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:47:19.580481.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T17_47_19.580481", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:47:19.580481.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:47:19.580481.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T17_47_19.580481", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:47:19.580481.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:47:19.580481.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T17_47_19.580481", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:47:19.580481.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:47:19.580481.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T17_47_19.580481", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:47:19.580481.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:47:19.580481.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T17_47_19.580481", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:47:19.580481.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:47:19.580481.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T17_47_19.580481", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:47:19.580481.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:47:19.580481.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T17_47_19.580481", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T17:47:19.580481.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T17:47:19.580481.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T17_47_19.580481", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:47:19.580481.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:47:19.580481.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T17_47_19.580481", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T17:47:19.580481.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T17:47:19.580481.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_22T11_48_44.859139", "path": ["**/details_harness|winogrande|5_2023-10-22T11-48-44.859139.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-22T11-48-44.859139.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T17_47_19.580481", "path": ["results_2023-07-19T17:47:19.580481.parquet"]}, {"split": "2023_10_22T11_48_44.859139", "path": ["results_2023-10-22T11-48-44.859139.parquet"]}, {"split": "latest", "path": ["results_2023-10-22T11-48-44.859139.parquet"]}]}]}
|
2023-10-22T10:48:57+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of TheBloke/airoboros-7b-gpt4-fp16
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model TheBloke/airoboros-7b-gpt4-fp16 on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
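The repository name here is the one given in this card's metadata, so the snippet from the full card above applies directly:

```python
from datasets import load_dataset

data = load_dataset("open-llm-leaderboard/details_TheBloke__airoboros-7b-gpt4-fp16",
                    "harness_winogrande_5",
                    split="train")
```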
## Latest results
These are the latest results from run 2023-10-22T11:48:44.859139 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of TheBloke/airoboros-7b-gpt4-fp16",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/airoboros-7b-gpt4-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-22T11:48:44.859139(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of TheBloke/airoboros-7b-gpt4-fp16",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/airoboros-7b-gpt4-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-22T11:48:44.859139(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
26,
31,
174,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of TheBloke/airoboros-7b-gpt4-fp16## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/airoboros-7b-gpt4-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-22T11:48:44.859139(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
b5c888f93ae9323c5c892924282abfa5cf90d77c
|
# Dataset Card for Evaluation run of TheBloke/dromedary-65b-lora-HF
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/dromedary-65b-lora-HF
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/dromedary-65b-lora-HF](https://huggingface.co/TheBloke/dromedary-65b-lora-HF) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__dromedary-65b-lora-HF",
"harness_winogrande_5",
split="train")
```
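The aggregated "results" configuration can be loaded the same way; a minimal sketch, using the split names listed in this card's configs:

```python
from datasets import load_dataset

# "latest" always points to the most recent results file for this model;
# a timestamped split such as "2023_10_15T03_08_41.091963" pins one specific run.
results = load_dataset("open-llm-leaderboard/details_TheBloke__dromedary-65b-lora-HF",
	"results",
	split="latest")
```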
## Latest results
These are the [latest results from run 2023-10-15T03:08:41.091963](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__dromedary-65b-lora-HF/blob/main/results_2023-10-15T03-08-41.091963.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.001572986577181208,
"em_stderr": 0.0004058451132417735,
"f1": 0.058895763422818985,
"f1_stderr": 0.0012985937732460785,
"acc": 0.5318581619018498,
"acc_stderr": 0.01187681379526279
},
"harness|drop|3": {
"em": 0.001572986577181208,
"em_stderr": 0.0004058451132417735,
"f1": 0.058895763422818985,
"f1_stderr": 0.0012985937732460785
},
"harness|gsm8k|5": {
"acc": 0.2744503411675512,
"acc_stderr": 0.012291581170814905
},
"harness|winogrande|5": {
"acc": 0.7892659826361483,
"acc_stderr": 0.011462046419710674
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_TheBloke__dromedary-65b-lora-HF
|
[
"region:us"
] |
2023-08-18T10:24:23+00:00
|
{"pretty_name": "Evaluation run of TheBloke/dromedary-65b-lora-HF", "dataset_summary": "Dataset automatically created during the evaluation run of model [TheBloke/dromedary-65b-lora-HF](https://huggingface.co/TheBloke/dromedary-65b-lora-HF) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__dromedary-65b-lora-HF\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-15T03:08:41.091963](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__dromedary-65b-lora-HF/blob/main/results_2023-10-15T03-08-41.091963.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.001572986577181208,\n \"em_stderr\": 0.0004058451132417735,\n \"f1\": 0.058895763422818985,\n \"f1_stderr\": 0.0012985937732460785,\n \"acc\": 0.5318581619018498,\n \"acc_stderr\": 0.01187681379526279\n },\n \"harness|drop|3\": {\n \"em\": 0.001572986577181208,\n \"em_stderr\": 0.0004058451132417735,\n \"f1\": 0.058895763422818985,\n \"f1_stderr\": 0.0012985937732460785\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2744503411675512,\n \"acc_stderr\": 0.012291581170814905\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7892659826361483,\n \"acc_stderr\": 0.011462046419710674\n }\n}\n```", "repo_url": "https://huggingface.co/TheBloke/dromedary-65b-lora-HF", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_21T02_37_03.243913", "path": ["**/details_harness|arc:challenge|25_2023-07-21T02:37:03.243913.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-21T02:37:03.243913.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_15T03_08_41.091963", "path": ["**/details_harness|drop|3_2023-10-15T03-08-41.091963.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-15T03-08-41.091963.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_15T03_08_41.091963", "path": ["**/details_harness|gsm8k|5_2023-10-15T03-08-41.091963.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-15T03-08-41.091963.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_21T02_37_03.243913", "path": ["**/details_harness|hellaswag|10_2023-07-21T02:37:03.243913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-21T02:37:03.243913.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_5", "data_files": [{"split": "2023_07_21T02_37_03.243913", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-21T02:37:03.243913.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-21T02:37:03.243913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-21T02:37:03.243913.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-21T02:37:03.243913.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-21T02:37:03.243913.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_21T02_37_03.243913", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-21T02:37:03.243913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-21T02:37:03.243913.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_21T02_37_03.243913", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-21T02:37:03.243913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-21T02:37:03.243913.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_21T02_37_03.243913", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-21T02:37:03.243913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-21T02:37:03.243913.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_21T02_37_03.243913", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-21T02:37:03.243913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-21T02:37:03.243913.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_21T02_37_03.243913", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-21T02:37:03.243913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-21T02:37:03.243913.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_21T02_37_03.243913", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-21T02:37:03.243913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-21T02:37:03.243913.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_21T02_37_03.243913", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-21T02:37:03.243913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-21T02:37:03.243913.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_21T02_37_03.243913", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-21T02:37:03.243913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-21T02:37:03.243913.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_21T02_37_03.243913", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-21T02:37:03.243913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-21T02:37:03.243913.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_21T02_37_03.243913", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-21T02:37:03.243913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-21T02:37:03.243913.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_21T02_37_03.243913", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-21T02:37:03.243913.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-21T02:37:03.243913.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_21T02_37_03.243913", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-21T02:37:03.243913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-21T02:37:03.243913.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_21T02_37_03.243913", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-21T02:37:03.243913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-21T02:37:03.243913.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_21T02_37_03.243913", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-21T02:37:03.243913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-21T02:37:03.243913.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_21T02_37_03.243913", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-21T02:37:03.243913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-21T02:37:03.243913.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_21T02_37_03.243913", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-21T02:37:03.243913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-21T02:37:03.243913.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_21T02_37_03.243913", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-21T02:37:03.243913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-21T02:37:03.243913.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_21T02_37_03.243913", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-21T02:37:03.243913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-21T02:37:03.243913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_21T02_37_03.243913", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-21T02:37:03.243913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-21T02:37:03.243913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_21T02_37_03.243913", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-21T02:37:03.243913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-21T02:37:03.243913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_21T02_37_03.243913", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-21T02:37:03.243913.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-21T02:37:03.243913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_21T02_37_03.243913", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-21T02:37:03.243913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-21T02:37:03.243913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_21T02_37_03.243913", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-21T02:37:03.243913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-21T02:37:03.243913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_21T02_37_03.243913", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-21T02:37:03.243913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-21T02:37:03.243913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_21T02_37_03.243913", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-21T02:37:03.243913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-21T02:37:03.243913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_21T02_37_03.243913", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-21T02:37:03.243913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-21T02:37:03.243913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_21T02_37_03.243913", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-21T02:37:03.243913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-21T02:37:03.243913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_21T02_37_03.243913", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-21T02:37:03.243913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-21T02:37:03.243913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_21T02_37_03.243913", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-21T02:37:03.243913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-21T02:37:03.243913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_21T02_37_03.243913", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-21T02:37:03.243913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-21T02:37:03.243913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_21T02_37_03.243913", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-21T02:37:03.243913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-21T02:37:03.243913.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_21T02_37_03.243913", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-21T02:37:03.243913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-21T02:37:03.243913.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_21T02_37_03.243913", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-21T02:37:03.243913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-21T02:37:03.243913.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_21T02_37_03.243913", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-21T02:37:03.243913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-21T02:37:03.243913.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_21T02_37_03.243913", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-21T02:37:03.243913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-21T02:37:03.243913.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_21T02_37_03.243913", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-21T02:37:03.243913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-21T02:37:03.243913.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_21T02_37_03.243913", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-21T02:37:03.243913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-21T02:37:03.243913.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_21T02_37_03.243913", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-21T02:37:03.243913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-21T02:37:03.243913.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_21T02_37_03.243913", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-21T02:37:03.243913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-21T02:37:03.243913.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_21T02_37_03.243913", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-21T02:37:03.243913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-21T02:37:03.243913.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_21T02_37_03.243913", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-21T02:37:03.243913.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-21T02:37:03.243913.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_21T02_37_03.243913", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-21T02:37:03.243913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-21T02:37:03.243913.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_21T02_37_03.243913", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-21T02:37:03.243913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-21T02:37:03.243913.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_21T02_37_03.243913", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-21T02:37:03.243913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-21T02:37:03.243913.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_21T02_37_03.243913", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-21T02:37:03.243913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-21T02:37:03.243913.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_21T02_37_03.243913", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-21T02:37:03.243913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-21T02:37:03.243913.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_21T02_37_03.243913", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-21T02:37:03.243913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-21T02:37:03.243913.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_21T02_37_03.243913", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-21T02:37:03.243913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-21T02:37:03.243913.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_21T02_37_03.243913", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-21T02:37:03.243913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-21T02:37:03.243913.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_21T02_37_03.243913", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-21T02:37:03.243913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-21T02:37:03.243913.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_21T02_37_03.243913", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-21T02:37:03.243913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-21T02:37:03.243913.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_21T02_37_03.243913", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-21T02:37:03.243913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-21T02:37:03.243913.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_21T02_37_03.243913", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-21T02:37:03.243913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-21T02:37:03.243913.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_21T02_37_03.243913", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-21T02:37:03.243913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-21T02:37:03.243913.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_21T02_37_03.243913", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-21T02:37:03.243913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-21T02:37:03.243913.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_21T02_37_03.243913", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-21T02:37:03.243913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-21T02:37:03.243913.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_21T02_37_03.243913", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-21T02:37:03.243913.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-21T02:37:03.243913.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_21T02_37_03.243913", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-21T02:37:03.243913.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-21T02:37:03.243913.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_15T03_08_41.091963", "path": ["**/details_harness|winogrande|5_2023-10-15T03-08-41.091963.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-15T03-08-41.091963.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_21T02_37_03.243913", "path": ["results_2023-07-21T02:37:03.243913.parquet"]}, {"split": "2023_10_15T03_08_41.091963", "path": ["results_2023-10-15T03-08-41.091963.parquet"]}, {"split": "latest", "path": ["results_2023-10-15T03-08-41.091963.parquet"]}]}]}
|
2023-10-15T02:08:52+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of TheBloke/dromedary-65b-lora-HF
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model TheBloke/dromedary-65b-lora-HF on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
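A minimal sketch, mirroring the loading snippet used in the full card for this dataset:

```python
from datasets import load_dataset

# Load the per-sample details of the 5-shot Winogrande eval for this model.
data = load_dataset("open-llm-leaderboard/details_TheBloke__dromedary-65b-lora-HF",
	"harness_winogrande_5",
	split="train")
```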
## Latest results
These are the latest results from run 2023-10-15T03:08:41.091963 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of TheBloke/dromedary-65b-lora-HF",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/dromedary-65b-lora-HF on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-15T03:08:41.091963(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of TheBloke/dromedary-65b-lora-HF",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/dromedary-65b-lora-HF on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-15T03:08:41.091963(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
24,
31,
172,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of TheBloke/dromedary-65b-lora-HF## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/dromedary-65b-lora-HF on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-15T03:08:41.091963(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
9cec2204414a269acef0f7efe6f8794b8078d99f
|
# Dataset Card for Evaluation run of TheBloke/guanaco-7B-HF
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/guanaco-7B-HF
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/guanaco-7B-HF](https://huggingface.co/TheBloke/guanaco-7B-HF) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__guanaco-7B-HF",
"harness_winogrande_5",
split="train")
```
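To pin a specific run instead of the moving "latest"/"train" pointer, pass the timestamped split name listed in this card's configs; a minimal sketch:

```python
from datasets import load_dataset

# Pin the 5-shot GSM8K details to the 2023-10-23 run rather than the latest one.
gsm8k = load_dataset("open-llm-leaderboard/details_TheBloke__guanaco-7B-HF",
	"harness_gsm8k_5",
	split="2023_10_23T00_48_06.944333")
```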
## Latest results
These are the [latest results from run 2023-10-23T00:48:06.944333](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__guanaco-7B-HF/blob/main/results_2023-10-23T00-48-06.944333.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0009437919463087249,
"em_stderr": 0.0003144653119413489,
"f1": 0.05533032718120824,
"f1_stderr": 0.001296240126534493,
"acc": 0.38254088595256147,
"acc_stderr": 0.009372441983458353
},
"harness|drop|3": {
"em": 0.0009437919463087249,
"em_stderr": 0.0003144653119413489,
"f1": 0.05533032718120824,
"f1_stderr": 0.001296240126534493
},
"harness|gsm8k|5": {
"acc": 0.05079605761940864,
"acc_stderr": 0.006048352096878091
},
"harness|winogrande|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.012696531870038616
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_TheBloke__guanaco-7B-HF
|
[
"region:us"
] |
2023-08-18T10:24:31+00:00
|
{"pretty_name": "Evaluation run of TheBloke/guanaco-7B-HF", "dataset_summary": "Dataset automatically created during the evaluation run of model [TheBloke/guanaco-7B-HF](https://huggingface.co/TheBloke/guanaco-7B-HF) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__guanaco-7B-HF\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-23T00:48:06.944333](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__guanaco-7B-HF/blob/main/results_2023-10-23T00-48-06.944333.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0009437919463087249,\n \"em_stderr\": 0.0003144653119413489,\n \"f1\": 0.05533032718120824,\n \"f1_stderr\": 0.001296240126534493,\n \"acc\": 0.38254088595256147,\n \"acc_stderr\": 0.009372441983458353\n },\n \"harness|drop|3\": {\n \"em\": 0.0009437919463087249,\n \"em_stderr\": 0.0003144653119413489,\n \"f1\": 0.05533032718120824,\n \"f1_stderr\": 0.001296240126534493\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.05079605761940864,\n \"acc_stderr\": 0.006048352096878091\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.012696531870038616\n }\n}\n```", "repo_url": "https://huggingface.co/TheBloke/guanaco-7B-HF", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T16_53_22.829156", "path": ["**/details_harness|arc:challenge|25_2023-07-19T16:53:22.829156.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T16:53:22.829156.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_23T00_48_06.944333", "path": ["**/details_harness|drop|3_2023-10-23T00-48-06.944333.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-23T00-48-06.944333.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_23T00_48_06.944333", "path": ["**/details_harness|gsm8k|5_2023-10-23T00-48-06.944333.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-23T00-48-06.944333.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T16_53_22.829156", "path": ["**/details_harness|hellaswag|10_2023-07-19T16:53:22.829156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T16:53:22.829156.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_07_19T16_53_22.829156", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:53:22.829156.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:53:22.829156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:53:22.829156.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T16:53:22.829156.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:53:22.829156.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T16_53_22.829156", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:53:22.829156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:53:22.829156.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T16_53_22.829156", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:53:22.829156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:53:22.829156.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T16_53_22.829156", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:53:22.829156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:53:22.829156.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T16_53_22.829156", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:53:22.829156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:53:22.829156.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T16_53_22.829156", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:53:22.829156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:53:22.829156.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T16_53_22.829156", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:53:22.829156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:53:22.829156.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T16_53_22.829156", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:53:22.829156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:53:22.829156.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T16_53_22.829156", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:53:22.829156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:53:22.829156.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T16_53_22.829156", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:53:22.829156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:53:22.829156.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T16_53_22.829156", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:53:22.829156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:53:22.829156.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T16_53_22.829156", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:53:22.829156.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:53:22.829156.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T16_53_22.829156", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:53:22.829156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:53:22.829156.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T16_53_22.829156", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:53:22.829156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:53:22.829156.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T16_53_22.829156", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:53:22.829156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:53:22.829156.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T16_53_22.829156", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:53:22.829156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:53:22.829156.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T16_53_22.829156", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:53:22.829156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:53:22.829156.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T16_53_22.829156", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:53:22.829156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:53:22.829156.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T16_53_22.829156", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:53:22.829156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:53:22.829156.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T16_53_22.829156", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:53:22.829156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:53:22.829156.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T16_53_22.829156", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:53:22.829156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:53:22.829156.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T16_53_22.829156", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:53:22.829156.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:53:22.829156.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T16_53_22.829156", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:53:22.829156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:53:22.829156.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T16_53_22.829156", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:53:22.829156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:53:22.829156.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T16_53_22.829156", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:53:22.829156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:53:22.829156.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T16_53_22.829156", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:53:22.829156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:53:22.829156.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T16_53_22.829156", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:53:22.829156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:53:22.829156.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T16_53_22.829156", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:53:22.829156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:53:22.829156.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T16_53_22.829156", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:53:22.829156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:53:22.829156.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T16_53_22.829156", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:53:22.829156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:53:22.829156.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T16_53_22.829156", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:53:22.829156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:53:22.829156.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T16_53_22.829156", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:53:22.829156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:53:22.829156.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T16_53_22.829156", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:53:22.829156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:53:22.829156.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T16_53_22.829156", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:53:22.829156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:53:22.829156.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T16_53_22.829156", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:53:22.829156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:53:22.829156.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T16_53_22.829156", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:53:22.829156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:53:22.829156.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T16_53_22.829156", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:53:22.829156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:53:22.829156.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T16_53_22.829156", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:53:22.829156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:53:22.829156.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T16_53_22.829156", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:53:22.829156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:53:22.829156.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T16_53_22.829156", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T16:53:22.829156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T16:53:22.829156.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T16_53_22.829156", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:53:22.829156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:53:22.829156.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T16_53_22.829156", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:53:22.829156.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:53:22.829156.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T16_53_22.829156", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:53:22.829156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:53:22.829156.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T16_53_22.829156", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:53:22.829156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:53:22.829156.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T16_53_22.829156", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:53:22.829156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:53:22.829156.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T16_53_22.829156", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:53:22.829156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:53:22.829156.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T16_53_22.829156", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:53:22.829156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:53:22.829156.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T16_53_22.829156", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:53:22.829156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:53:22.829156.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T16_53_22.829156", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:53:22.829156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:53:22.829156.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T16_53_22.829156", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:53:22.829156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:53:22.829156.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T16_53_22.829156", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:53:22.829156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:53:22.829156.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T16_53_22.829156", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:53:22.829156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:53:22.829156.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T16_53_22.829156", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:53:22.829156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:53:22.829156.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T16_53_22.829156", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:53:22.829156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:53:22.829156.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T16_53_22.829156", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:53:22.829156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:53:22.829156.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T16_53_22.829156", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:53:22.829156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:53:22.829156.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T16_53_22.829156", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T16:53:22.829156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T16:53:22.829156.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T16_53_22.829156", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:53:22.829156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:53:22.829156.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T16_53_22.829156", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T16:53:22.829156.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T16:53:22.829156.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_23T00_48_06.944333", "path": ["**/details_harness|winogrande|5_2023-10-23T00-48-06.944333.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-23T00-48-06.944333.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T16_53_22.829156", "path": ["results_2023-07-19T16:53:22.829156.parquet"]}, {"split": "2023_10_23T00_48_06.944333", "path": ["results_2023-10-23T00-48-06.944333.parquet"]}, {"split": "latest", "path": ["results_2023-10-23T00-48-06.944333.parquet"]}]}]}
|
2023-10-22T23:48:19+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of TheBloke/guanaco-7B-HF
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model TheBloke/guanaco-7B-HF on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
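```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__guanaco-7B-HF",
    "harness_winogrande_5",
    split="train")
```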
## Latest results
These are the latest results from run 2023-10-23T00:48:06.944333 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
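```python
{
    "all": {
        "em": 0.0009437919463087249,
        "em_stderr": 0.0003144653119413489,
        "f1": 0.05533032718120824,
        "f1_stderr": 0.001296240126534493,
        "acc": 0.38254088595256147,
        "acc_stderr": 0.009372441983458353
    },
    "harness|drop|3": {
        "em": 0.0009437919463087249,
        "em_stderr": 0.0003144653119413489,
        "f1": 0.05533032718120824,
        "f1_stderr": 0.001296240126534493
    },
    "harness|gsm8k|5": {
        "acc": 0.05079605761940864,
        "acc_stderr": 0.006048352096878091
    },
    "harness|winogrande|5": {
        "acc": 0.7142857142857143,
        "acc_stderr": 0.012696531870038616
    }
}
```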
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of TheBloke/guanaco-7B-HF",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/guanaco-7B-HF on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-23T00:48:06.944333(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of TheBloke/guanaco-7B-HF",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/guanaco-7B-HF on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-23T00:48:06.944333(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
21,
31,
169,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of TheBloke/guanaco-7B-HF## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/guanaco-7B-HF on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-23T00:48:06.944333(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
266a7d8a44b9455b543d8d13950ef7ae86f013bc
|
# Dataset Card for Evaluation run of TheBloke/gpt4-x-vicuna-13B-HF
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/gpt4-x-vicuna-13B-HF
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/gpt4-x-vicuna-13B-HF](https://huggingface.co/TheBloke/gpt4-x-vicuna-13B-HF) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__gpt4-x-vicuna-13B-HF",
"harness_truthfulqa_mc_0",
split="train")
```
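You can also, for instance, pin a specific run or pull the aggregated metrics. The sketch below assumes the split-naming convention used elsewhere in this collection (the run timestamp with `-` and `:` rewritten as `_`) and that the `results` configuration exposes a `latest` split, as the other evaluation datasets here do:

```python
from datasets import load_dataset

# Pin a specific run via its timestamped split; the split name below is
# inferred from this card's run timestamp, so check the dataset metadata
# for the exact names.
run_details = load_dataset("open-llm-leaderboard/details_TheBloke__gpt4-x-vicuna-13B-HF",
    "harness_truthfulqa_mc_0",
    split="2023_07_19T19_01_51.030763")

# Load the aggregated metrics; "latest" is assumed to point at the most
# recent results file.
results = load_dataset("open-llm-leaderboard/details_TheBloke__gpt4-x-vicuna-13B-HF",
    "results",
    split="latest")
```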
## Latest results
These are the [latest results from run 2023-07-19T19:01:51.030763](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__gpt4-x-vicuna-13B-HF/blob/main/results_2023-07-19T19%3A01%3A51.030763.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5137597162733054,
"acc_stderr": 0.03484317305077308,
"acc_norm": 0.5174954549900392,
"acc_norm_stderr": 0.03482742951911445,
"mc1": 0.3635250917992656,
"mc1_stderr": 0.016838862883965827,
"mc2": 0.5357942440986606,
"mc2_stderr": 0.015916184024373756
},
"harness|arc:challenge|25": {
"acc": 0.5110921501706485,
"acc_stderr": 0.01460779491401305,
"acc_norm": 0.5341296928327645,
"acc_norm_stderr": 0.014577311315231104
},
"harness|hellaswag|10": {
"acc": 0.6038637721569409,
"acc_stderr": 0.004880937933163287,
"acc_norm": 0.8012348137821151,
"acc_norm_stderr": 0.003982553164086259
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45925925925925926,
"acc_stderr": 0.04304979692464243,
"acc_norm": 0.45925925925925926,
"acc_norm_stderr": 0.04304979692464243
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.506578947368421,
"acc_stderr": 0.040685900502249704,
"acc_norm": 0.506578947368421,
"acc_norm_stderr": 0.040685900502249704
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4867924528301887,
"acc_stderr": 0.030762134874500482,
"acc_norm": 0.4867924528301887,
"acc_norm_stderr": 0.030762134874500482
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5486111111111112,
"acc_stderr": 0.04161402398403279,
"acc_norm": 0.5486111111111112,
"acc_norm_stderr": 0.04161402398403279
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4046242774566474,
"acc_stderr": 0.03742461193887249,
"acc_norm": 0.4046242774566474,
"acc_norm_stderr": 0.03742461193887249
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617747,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617747
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3617021276595745,
"acc_stderr": 0.03141082197596241,
"acc_norm": 0.3617021276595745,
"acc_norm_stderr": 0.03141082197596241
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159394,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159394
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.47586206896551725,
"acc_stderr": 0.041618085035015295,
"acc_norm": 0.47586206896551725,
"acc_norm_stderr": 0.041618085035015295
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.28835978835978837,
"acc_stderr": 0.023330654054535896,
"acc_norm": 0.28835978835978837,
"acc_norm_stderr": 0.023330654054535896
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5741935483870968,
"acc_stderr": 0.028129112709165894,
"acc_norm": 0.5741935483870968,
"acc_norm_stderr": 0.028129112709165894
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3891625615763547,
"acc_stderr": 0.03430462416103873,
"acc_norm": 0.3891625615763547,
"acc_norm_stderr": 0.03430462416103873
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6424242424242425,
"acc_stderr": 0.037425970438065864,
"acc_norm": 0.6424242424242425,
"acc_norm_stderr": 0.037425970438065864
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.034273086529999344,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.034273086529999344
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7046632124352331,
"acc_stderr": 0.03292296639155141,
"acc_norm": 0.7046632124352331,
"acc_norm_stderr": 0.03292296639155141
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.43846153846153846,
"acc_stderr": 0.02515826601686857,
"acc_norm": 0.43846153846153846,
"acc_norm_stderr": 0.02515826601686857
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.23703703703703705,
"acc_stderr": 0.025928876132766135,
"acc_norm": 0.23703703703703705,
"acc_norm_stderr": 0.025928876132766135
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.0322529423239964,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.0322529423239964
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6770642201834862,
"acc_stderr": 0.02004811592341531,
"acc_norm": 0.6770642201834862,
"acc_norm_stderr": 0.02004811592341531
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.36574074074074076,
"acc_stderr": 0.03284738857647206,
"acc_norm": 0.36574074074074076,
"acc_norm_stderr": 0.03284738857647206
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6715686274509803,
"acc_stderr": 0.03296245110172228,
"acc_norm": 0.6715686274509803,
"acc_norm_stderr": 0.03296245110172228
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7046413502109705,
"acc_stderr": 0.02969633871342288,
"acc_norm": 0.7046413502109705,
"acc_norm_stderr": 0.02969633871342288
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5739910313901345,
"acc_stderr": 0.03318833286217281,
"acc_norm": 0.5739910313901345,
"acc_norm_stderr": 0.03318833286217281
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6717557251908397,
"acc_stderr": 0.04118438565806298,
"acc_norm": 0.6717557251908397,
"acc_norm_stderr": 0.04118438565806298
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6776859504132231,
"acc_stderr": 0.042664163633521685,
"acc_norm": 0.6776859504132231,
"acc_norm_stderr": 0.042664163633521685
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6574074074074074,
"acc_stderr": 0.045879047413018105,
"acc_norm": 0.6574074074074074,
"acc_norm_stderr": 0.045879047413018105
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6625766871165644,
"acc_stderr": 0.03714908409935574,
"acc_norm": 0.6625766871165644,
"acc_norm_stderr": 0.03714908409935574
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.39285714285714285,
"acc_stderr": 0.04635550135609976,
"acc_norm": 0.39285714285714285,
"acc_norm_stderr": 0.04635550135609976
},
"harness|hendrycksTest-management|5": {
"acc": 0.6796116504854369,
"acc_stderr": 0.04620284082280041,
"acc_norm": 0.6796116504854369,
"acc_norm_stderr": 0.04620284082280041
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7735042735042735,
"acc_stderr": 0.027421007295392912,
"acc_norm": 0.7735042735042735,
"acc_norm_stderr": 0.027421007295392912
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6871008939974457,
"acc_stderr": 0.01658093594030406,
"acc_norm": 0.6871008939974457,
"acc_norm_stderr": 0.01658093594030406
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5375722543352601,
"acc_stderr": 0.026842985519615375,
"acc_norm": 0.5375722543352601,
"acc_norm_stderr": 0.026842985519615375
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.31731843575418994,
"acc_stderr": 0.01556639263005703,
"acc_norm": 0.31731843575418994,
"acc_norm_stderr": 0.01556639263005703
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5424836601307189,
"acc_stderr": 0.028526383452142638,
"acc_norm": 0.5424836601307189,
"acc_norm_stderr": 0.028526383452142638
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5530546623794212,
"acc_stderr": 0.02823776942208535,
"acc_norm": 0.5530546623794212,
"acc_norm_stderr": 0.02823776942208535
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5740740740740741,
"acc_stderr": 0.027513747284379424,
"acc_norm": 0.5740740740740741,
"acc_norm_stderr": 0.027513747284379424
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.38652482269503546,
"acc_stderr": 0.02904919034254346,
"acc_norm": 0.38652482269503546,
"acc_norm_stderr": 0.02904919034254346
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.41264667535853977,
"acc_stderr": 0.012573836633799015,
"acc_norm": 0.41264667535853977,
"acc_norm_stderr": 0.012573836633799015
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.44485294117647056,
"acc_stderr": 0.03018753206032939,
"acc_norm": 0.44485294117647056,
"acc_norm_stderr": 0.03018753206032939
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5196078431372549,
"acc_stderr": 0.020212274976302957,
"acc_norm": 0.5196078431372549,
"acc_norm_stderr": 0.020212274976302957
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5454545454545454,
"acc_stderr": 0.04769300568972743,
"acc_norm": 0.5454545454545454,
"acc_norm_stderr": 0.04769300568972743
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5795918367346938,
"acc_stderr": 0.03160106993449601,
"acc_norm": 0.5795918367346938,
"acc_norm_stderr": 0.03160106993449601
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7412935323383084,
"acc_stderr": 0.030965903123573033,
"acc_norm": 0.7412935323383084,
"acc_norm_stderr": 0.030965903123573033
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.45180722891566266,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.45180722891566266,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7426900584795322,
"acc_stderr": 0.03352799844161865,
"acc_norm": 0.7426900584795322,
"acc_norm_stderr": 0.03352799844161865
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3635250917992656,
"mc1_stderr": 0.016838862883965827,
"mc2": 0.5357942440986606,
"mc2_stderr": 0.015916184024373756
}
}
```
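As a quick, hypothetical sketch of how to work with these numbers, the snippet below macro-averages the `hendrycksTest` (MMLU) accuracies from a dictionary shaped like the JSON above (only two entries are copied in; the remaining tasks are elided):

```python
# Hypothetical sketch: macro-average the MMLU (hendrycksTest) accuracies
# from a results dict shaped like the JSON above.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.31},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.45925925925925926},
    # ... remaining hendrycksTest tasks elided ...
}

mmlu_accs = [v["acc"] for k, v in results.items()
             if k.startswith("harness|hendrycksTest-")]
print(f"MMLU macro-average accuracy: {sum(mmlu_accs) / len(mmlu_accs):.4f}")
```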
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_TheBloke__gpt4-x-vicuna-13B-HF
|
[
"region:us"
] |
2023-08-18T10:24:41+00:00
|
{"pretty_name": "Evaluation run of TheBloke/gpt4-x-vicuna-13B-HF", "dataset_summary": "Dataset automatically created during the evaluation run of model [TheBloke/gpt4-x-vicuna-13B-HF](https://huggingface.co/TheBloke/gpt4-x-vicuna-13B-HF) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__gpt4-x-vicuna-13B-HF\",\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-07-19T19:01:51.030763](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__gpt4-x-vicuna-13B-HF/blob/main/results_2023-07-19T19%3A01%3A51.030763.json) (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5137597162733054,\n \"acc_stderr\": 0.03484317305077308,\n \"acc_norm\": 0.5174954549900392,\n \"acc_norm_stderr\": 0.03482742951911445,\n \"mc1\": 0.3635250917992656,\n \"mc1_stderr\": 0.016838862883965827,\n \"mc2\": 0.5357942440986606,\n \"mc2_stderr\": 0.015916184024373756\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5110921501706485,\n \"acc_stderr\": 0.01460779491401305,\n \"acc_norm\": 0.5341296928327645,\n \"acc_norm_stderr\": 0.014577311315231104\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6038637721569409,\n \"acc_stderr\": 0.004880937933163287,\n \"acc_norm\": 0.8012348137821151,\n \"acc_norm_stderr\": 0.003982553164086259\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45925925925925926,\n \"acc_stderr\": 0.04304979692464243,\n \"acc_norm\": 0.45925925925925926,\n \"acc_norm_stderr\": 0.04304979692464243\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.506578947368421,\n \"acc_stderr\": 0.040685900502249704,\n \"acc_norm\": 0.506578947368421,\n \"acc_norm_stderr\": 0.040685900502249704\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.4867924528301887,\n \"acc_stderr\": 0.030762134874500482,\n \"acc_norm\": 0.4867924528301887,\n \"acc_norm_stderr\": 0.030762134874500482\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5486111111111112,\n \"acc_stderr\": 0.04161402398403279,\n \"acc_norm\": 0.5486111111111112,\n \"acc_norm_stderr\": 0.04161402398403279\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n 
\"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4046242774566474,\n \"acc_stderr\": 0.03742461193887249,\n \"acc_norm\": 0.4046242774566474,\n \"acc_norm_stderr\": 0.03742461193887249\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617747,\n \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617747\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.3617021276595745,\n \"acc_stderr\": 0.03141082197596241,\n \"acc_norm\": 0.3617021276595745,\n \"acc_norm_stderr\": 0.03141082197596241\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n \"acc_stderr\": 0.04266339443159394,\n \"acc_norm\": 0.2894736842105263,\n \"acc_norm_stderr\": 0.04266339443159394\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.47586206896551725,\n \"acc_stderr\": 0.041618085035015295,\n \"acc_norm\": 0.47586206896551725,\n \"acc_norm_stderr\": 0.041618085035015295\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.28835978835978837,\n \"acc_stderr\": 0.023330654054535896,\n \"acc_norm\": 0.28835978835978837,\n \"acc_norm_stderr\": 0.023330654054535896\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5741935483870968,\n \"acc_stderr\": 0.028129112709165894,\n \"acc_norm\": 0.5741935483870968,\n \"acc_norm_stderr\": 0.028129112709165894\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.3891625615763547,\n \"acc_stderr\": 0.03430462416103873,\n \"acc_norm\": 0.3891625615763547,\n \"acc_norm_stderr\": 0.03430462416103873\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6424242424242425,\n \"acc_stderr\": 0.037425970438065864,\n \"acc_norm\": 0.6424242424242425,\n \"acc_norm_stderr\": 0.037425970438065864\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.034273086529999344,\n \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.034273086529999344\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7046632124352331,\n \"acc_stderr\": 0.03292296639155141,\n \"acc_norm\": 0.7046632124352331,\n \"acc_norm_stderr\": 0.03292296639155141\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n 
\"acc\": 0.43846153846153846,\n \"acc_stderr\": 0.02515826601686857,\n \"acc_norm\": 0.43846153846153846,\n \"acc_norm_stderr\": 0.02515826601686857\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.23703703703703705,\n \"acc_stderr\": 0.025928876132766135,\n \"acc_norm\": 0.23703703703703705,\n \"acc_norm_stderr\": 0.025928876132766135\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.0322529423239964,\n \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.0322529423239964\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.6770642201834862,\n \"acc_stderr\": 0.02004811592341531,\n \"acc_norm\": 0.6770642201834862,\n \"acc_norm_stderr\": 0.02004811592341531\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.36574074074074076,\n \"acc_stderr\": 0.03284738857647206,\n \"acc_norm\": 0.36574074074074076,\n \"acc_norm_stderr\": 0.03284738857647206\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.6715686274509803,\n \"acc_stderr\": 0.03296245110172228,\n \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.03296245110172228\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7046413502109705,\n \"acc_stderr\": 0.02969633871342288,\n \"acc_norm\": 0.7046413502109705,\n \"acc_norm_stderr\": 0.02969633871342288\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5739910313901345,\n \"acc_stderr\": 0.03318833286217281,\n \"acc_norm\": 0.5739910313901345,\n \"acc_norm_stderr\": 0.03318833286217281\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6717557251908397,\n \"acc_stderr\": 0.04118438565806298,\n \"acc_norm\": 0.6717557251908397,\n \"acc_norm_stderr\": 0.04118438565806298\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6776859504132231,\n \"acc_stderr\": 0.042664163633521685,\n \"acc_norm\": 0.6776859504132231,\n \"acc_norm_stderr\": 0.042664163633521685\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6574074074074074,\n \"acc_stderr\": 0.045879047413018105,\n \"acc_norm\": 0.6574074074074074,\n \"acc_norm_stderr\": 0.045879047413018105\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6625766871165644,\n \"acc_stderr\": 0.03714908409935574,\n \"acc_norm\": 0.6625766871165644,\n \"acc_norm_stderr\": 0.03714908409935574\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n \"acc_stderr\": 0.04635550135609976,\n \"acc_norm\": 0.39285714285714285,\n \"acc_norm_stderr\": 0.04635550135609976\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6796116504854369,\n \"acc_stderr\": 0.04620284082280041,\n \"acc_norm\": 0.6796116504854369,\n \"acc_norm_stderr\": 0.04620284082280041\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7735042735042735,\n \"acc_stderr\": 0.027421007295392912,\n \"acc_norm\": 0.7735042735042735,\n \"acc_norm_stderr\": 0.027421007295392912\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6871008939974457,\n \"acc_stderr\": 0.01658093594030406,\n 
\"acc_norm\": 0.6871008939974457,\n \"acc_norm_stderr\": 0.01658093594030406\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5375722543352601,\n \"acc_stderr\": 0.026842985519615375,\n \"acc_norm\": 0.5375722543352601,\n \"acc_norm_stderr\": 0.026842985519615375\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.31731843575418994,\n \"acc_stderr\": 0.01556639263005703,\n \"acc_norm\": 0.31731843575418994,\n \"acc_norm_stderr\": 0.01556639263005703\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5424836601307189,\n \"acc_stderr\": 0.028526383452142638,\n \"acc_norm\": 0.5424836601307189,\n \"acc_norm_stderr\": 0.028526383452142638\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5530546623794212,\n \"acc_stderr\": 0.02823776942208535,\n \"acc_norm\": 0.5530546623794212,\n \"acc_norm_stderr\": 0.02823776942208535\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5740740740740741,\n \"acc_stderr\": 0.027513747284379424,\n \"acc_norm\": 0.5740740740740741,\n \"acc_norm_stderr\": 0.027513747284379424\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.38652482269503546,\n \"acc_stderr\": 0.02904919034254346,\n \"acc_norm\": 0.38652482269503546,\n \"acc_norm_stderr\": 0.02904919034254346\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.41264667535853977,\n \"acc_stderr\": 0.012573836633799015,\n \"acc_norm\": 0.41264667535853977,\n \"acc_norm_stderr\": 0.012573836633799015\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.44485294117647056,\n \"acc_stderr\": 0.03018753206032939,\n \"acc_norm\": 0.44485294117647056,\n \"acc_norm_stderr\": 0.03018753206032939\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5196078431372549,\n \"acc_stderr\": 0.020212274976302957,\n \"acc_norm\": 0.5196078431372549,\n \"acc_norm_stderr\": 0.020212274976302957\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5454545454545454,\n \"acc_stderr\": 0.04769300568972743,\n \"acc_norm\": 0.5454545454545454,\n \"acc_norm_stderr\": 0.04769300568972743\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.5795918367346938,\n \"acc_stderr\": 0.03160106993449601,\n \"acc_norm\": 0.5795918367346938,\n \"acc_norm_stderr\": 0.03160106993449601\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7412935323383084,\n \"acc_stderr\": 0.030965903123573033,\n \"acc_norm\": 0.7412935323383084,\n \"acc_norm_stderr\": 0.030965903123573033\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.45180722891566266,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.45180722891566266,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7426900584795322,\n \"acc_stderr\": 0.03352799844161865,\n \"acc_norm\": 0.7426900584795322,\n \"acc_norm_stderr\": 0.03352799844161865\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3635250917992656,\n \"mc1_stderr\": 0.016838862883965827,\n \"mc2\": 0.5357942440986606,\n \"mc2_stderr\": 0.015916184024373756\n }\n}\n```", "repo_url": "https://huggingface.co/TheBloke/gpt4-x-vicuna-13B-HF", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": 
"harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T19_01_51.030763", "path": ["**/details_harness|arc:challenge|25_2023-07-19T19:01:51.030763.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T19:01:51.030763.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T19_01_51.030763", "path": ["**/details_harness|hellaswag|10_2023-07-19T19:01:51.030763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T19:01:51.030763.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T19_01_51.030763", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:01:51.030763.parquet", 
"**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:01:51.030763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:01:51.030763.parquet", 
"**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:01:51.030763.parquet", 
"**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T19:01:51.030763.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:01:51.030763.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_19T19_01_51.030763", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:01:51.030763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:01:51.030763.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T19_01_51.030763", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:01:51.030763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:01:51.030763.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T19_01_51.030763", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:01:51.030763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:01:51.030763.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T19_01_51.030763", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:01:51.030763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:01:51.030763.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T19_01_51.030763", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:01:51.030763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:01:51.030763.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T19_01_51.030763", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:01:51.030763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:01:51.030763.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T19_01_51.030763", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:01:51.030763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:01:51.030763.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T19_01_51.030763", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:01:51.030763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:01:51.030763.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T19_01_51.030763", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:01:51.030763.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:01:51.030763.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T19_01_51.030763", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:01:51.030763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:01:51.030763.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T19_01_51.030763", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:01:51.030763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:01:51.030763.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T19_01_51.030763", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:01:51.030763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:01:51.030763.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T19_01_51.030763", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:01:51.030763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:01:51.030763.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T19_01_51.030763", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:01:51.030763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:01:51.030763.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T19_01_51.030763", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:01:51.030763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:01:51.030763.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T19_01_51.030763", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:01:51.030763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:01:51.030763.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T19_01_51.030763", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:01:51.030763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:01:51.030763.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T19_01_51.030763", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:01:51.030763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:01:51.030763.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T19_01_51.030763", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:01:51.030763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:01:51.030763.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T19_01_51.030763", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:01:51.030763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:01:51.030763.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T19_01_51.030763", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:01:51.030763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:01:51.030763.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T19_01_51.030763", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:01:51.030763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:01:51.030763.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T19_01_51.030763", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:01:51.030763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:01:51.030763.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T19_01_51.030763", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:01:51.030763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:01:51.030763.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T19_01_51.030763", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:01:51.030763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:01:51.030763.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T19_01_51.030763", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:01:51.030763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:01:51.030763.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T19_01_51.030763", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:01:51.030763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:01:51.030763.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T19_01_51.030763", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:01:51.030763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:01:51.030763.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T19_01_51.030763", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:01:51.030763.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:01:51.030763.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T19_01_51.030763", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:01:51.030763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:01:51.030763.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T19_01_51.030763", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:01:51.030763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:01:51.030763.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T19_01_51.030763", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:01:51.030763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:01:51.030763.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T19_01_51.030763", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:01:51.030763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:01:51.030763.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T19_01_51.030763", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:01:51.030763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:01:51.030763.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T19_01_51.030763", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:01:51.030763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:01:51.030763.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T19_01_51.030763", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:01:51.030763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:01:51.030763.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T19_01_51.030763", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:01:51.030763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:01:51.030763.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T19_01_51.030763", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:01:51.030763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:01:51.030763.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T19_01_51.030763", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T19:01:51.030763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T19:01:51.030763.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T19_01_51.030763", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:01:51.030763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:01:51.030763.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T19_01_51.030763", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:01:51.030763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:01:51.030763.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T19_01_51.030763", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:01:51.030763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:01:51.030763.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T19_01_51.030763", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:01:51.030763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:01:51.030763.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T19_01_51.030763", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:01:51.030763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:01:51.030763.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T19_01_51.030763", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:01:51.030763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:01:51.030763.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T19_01_51.030763", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:01:51.030763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:01:51.030763.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T19_01_51.030763", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:01:51.030763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:01:51.030763.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T19_01_51.030763", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:01:51.030763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:01:51.030763.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T19_01_51.030763", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:01:51.030763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:01:51.030763.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T19_01_51.030763", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:01:51.030763.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:01:51.030763.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T19_01_51.030763", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:01:51.030763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:01:51.030763.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_19T19_01_51.030763", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:01:51.030763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:01:51.030763.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T19_01_51.030763", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:01:51.030763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:01:51.030763.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T19_01_51.030763", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:01:51.030763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:01:51.030763.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T19_01_51.030763", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:01:51.030763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:01:51.030763.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T19_01_51.030763", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T19:01:51.030763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T19:01:51.030763.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T19_01_51.030763", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:01:51.030763.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:01:51.030763.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T19_01_51.030763", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T19:01:51.030763.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T19:01:51.030763.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T19_01_51.030763", "path": ["results_2023-07-19T19:01:51.030763.parquet"]}, {"split": "latest", "path": ["results_2023-07-19T19:01:51.030763.parquet"]}]}]}
# Dataset Card for Evaluation run of TheBloke/gpt4-x-vicuna-13B-HF
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/gpt4-x-vicuna-13B-HF
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/gpt4-x-vicuna-13B-HF](https://huggingface.co/TheBloke/gpt4-x-vicuna-13B-HF) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
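```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__gpt4-x-vicuna-13B-HF",
	"harness_truthfulqa_mc_0",
	split="train")
```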
## Latest results
These are the [latest results from run 2023-07-19T19:01:51.030763](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__gpt4-x-vicuna-13B-HF/blob/main/results_2023-07-19T19%3A01%3A51.030763.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
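For reference, the aggregated ("all") portion of those results, reproduced from this card's metadata record (which also stores the full per-task breakdown):

```python
{
    "all": {
        "acc": 0.5137597162733054,
        "acc_stderr": 0.03484317305077308,
        "acc_norm": 0.5174954549900392,
        "acc_norm_stderr": 0.03482742951911445,
        "mc1": 0.3635250917992656,
        "mc1_stderr": 0.016838862883965827,
        "mc2": 0.5357942440986606,
        "mc2_stderr": 0.015916184024373756
    }
}
```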
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
# Dataset Card for Evaluation run of TheBloke/WizardLM-13B-V1-1-SuperHOT-8K-fp16
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/WizardLM-13B-V1-1-SuperHOT-8K-fp16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/WizardLM-13B-V1-1-SuperHOT-8K-fp16](https://huggingface.co/TheBloke/WizardLM-13B-V1-1-SuperHOT-8K-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__WizardLM-13B-V1-1-SuperHOT-8K-fp16",
"harness_winogrande_5",
split="train")
```
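The aggregated scores described above are exposed through the "results" configuration as well. A minimal sketch, assuming it follows the same split layout as the per-task configurations listed in the metadata records:

```python
from datasets import load_dataset

# Load the aggregated "results" configuration; the "latest" split points
# at the most recent evaluation run. A specific run can instead be loaded
# via its timestamped split, e.g. "2023_10_22T05_46_44.212362".
results = load_dataset("open-llm-leaderboard/details_TheBloke__WizardLM-13B-V1-1-SuperHOT-8K-fp16",
	"results",
	split="latest")
```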
## Latest results
These are the [latest results from run 2023-10-22T05:46:44.212362](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__WizardLM-13B-V1-1-SuperHOT-8K-fp16/blob/main/results_2023-10-22T05-46-44.212362.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.2419253355704698,
"em_stderr": 0.004385673721154169,
"f1": 0.30457843959731623,
"f1_stderr": 0.00439090225052454,
"acc": 0.38382232120791804,
"acc_stderr": 0.007195680070781476
},
"harness|drop|3": {
"em": 0.2419253355704698,
"em_stderr": 0.004385673721154169,
"f1": 0.30457843959731623,
"f1_stderr": 0.00439090225052454
},
"harness|gsm8k|5": {
"acc": 0.0075815011372251705,
"acc_stderr": 0.0023892815120772092
},
"harness|winogrande|5": {
"acc": 0.7600631412786109,
"acc_stderr": 0.012002078629485742
}
}
```
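The same numbers can also be pulled straight from the JSON file linked above. A small sketch using `huggingface_hub` (the download call and the assumption about the file's top-level layout are illustrations, not part of this card):

```python
import json

from huggingface_hub import hf_hub_download

# Fetch the raw results file referenced above from the dataset repository.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_TheBloke__WizardLM-13B-V1-1-SuperHOT-8K-fp16",
    filename="results_2023-10-22T05-46-44.212362.json",
    repo_type="dataset",
)
with open(path) as f:
    data = json.load(f)

# The metrics may sit under a top-level "results" key; fall back to the root.
scores = data.get("results", data)
print(scores["harness|winogrande|5"]["acc"])  # expected: 0.7600631412786109
```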
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
{"pretty_name": "Evaluation run of TheBloke/WizardLM-13B-V1-1-SuperHOT-8K-fp16", "dataset_summary": "Dataset automatically created during the evaluation run of model [TheBloke/WizardLM-13B-V1-1-SuperHOT-8K-fp16](https://huggingface.co/TheBloke/WizardLM-13B-V1-1-SuperHOT-8K-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__WizardLM-13B-V1-1-SuperHOT-8K-fp16\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-22T05:46:44.212362](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__WizardLM-13B-V1-1-SuperHOT-8K-fp16/blob/main/results_2023-10-22T05-46-44.212362.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.2419253355704698,\n \"em_stderr\": 0.004385673721154169,\n \"f1\": 0.30457843959731623,\n \"f1_stderr\": 0.00439090225052454,\n \"acc\": 0.38382232120791804,\n \"acc_stderr\": 0.007195680070781476\n },\n \"harness|drop|3\": {\n \"em\": 0.2419253355704698,\n \"em_stderr\": 0.004385673721154169,\n \"f1\": 0.30457843959731623,\n \"f1_stderr\": 0.00439090225052454\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0075815011372251705,\n \"acc_stderr\": 0.0023892815120772092\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7600631412786109,\n \"acc_stderr\": 0.012002078629485742\n }\n}\n```", "repo_url": "https://huggingface.co/TheBloke/WizardLM-13B-V1-1-SuperHOT-8K-fp16", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_01T13_56_27.012351", "path": ["**/details_harness|arc:challenge|25_2023-08-01T13:56:27.012351.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-01T13:56:27.012351.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_22T05_46_44.212362", "path": ["**/details_harness|drop|3_2023-10-22T05-46-44.212362.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-22T05-46-44.212362.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_22T05_46_44.212362", "path": ["**/details_harness|gsm8k|5_2023-10-22T05-46-44.212362.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-22T05-46-44.212362.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_01T13_56_27.012351", "path": ["**/details_harness|hellaswag|10_2023-08-01T13:56:27.012351.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hellaswag|10_2023-08-01T13:56:27.012351.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_01T13_56_27.012351", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-01T13:56:27.012351.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-01T13:56:27.012351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-01T13:56:27.012351.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-01T13:56:27.012351.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-01T13:56:27.012351.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-08-01T13:56:27.012351.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_01T13_56_27.012351", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-01T13:56:27.012351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-01T13:56:27.012351.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_01T13_56_27.012351", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-01T13:56:27.012351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-01T13:56:27.012351.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_01T13_56_27.012351", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-01T13:56:27.012351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-01T13:56:27.012351.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_01T13_56_27.012351", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-01T13:56:27.012351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-01T13:56:27.012351.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_01T13_56_27.012351", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-01T13:56:27.012351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-01T13:56:27.012351.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_01T13_56_27.012351", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-01T13:56:27.012351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-01T13:56:27.012351.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_01T13_56_27.012351", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-01T13:56:27.012351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-01T13:56:27.012351.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_01T13_56_27.012351", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-01T13:56:27.012351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-01T13:56:27.012351.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_01T13_56_27.012351", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-01T13:56:27.012351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-01T13:56:27.012351.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_01T13_56_27.012351", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-01T13:56:27.012351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-01T13:56:27.012351.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_08_01T13_56_27.012351", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-01T13:56:27.012351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-01T13:56:27.012351.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_01T13_56_27.012351", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-01T13:56:27.012351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-01T13:56:27.012351.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_01T13_56_27.012351", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-01T13:56:27.012351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-01T13:56:27.012351.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_01T13_56_27.012351", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-01T13:56:27.012351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-01T13:56:27.012351.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_01T13_56_27.012351", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-01T13:56:27.012351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-01T13:56:27.012351.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_01T13_56_27.012351", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-01T13:56:27.012351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-01T13:56:27.012351.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_01T13_56_27.012351", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-01T13:56:27.012351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-01T13:56:27.012351.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_01T13_56_27.012351", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-01T13:56:27.012351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-01T13:56:27.012351.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_01T13_56_27.012351", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-01T13:56:27.012351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-01T13:56:27.012351.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_01T13_56_27.012351", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-01T13:56:27.012351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-01T13:56:27.012351.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_01T13_56_27.012351", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-01T13:56:27.012351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-01T13:56:27.012351.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_01T13_56_27.012351", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-01T13:56:27.012351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-01T13:56:27.012351.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_01T13_56_27.012351", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-01T13:56:27.012351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-01T13:56:27.012351.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_01T13_56_27.012351", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-01T13:56:27.012351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-01T13:56:27.012351.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_01T13_56_27.012351", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-01T13:56:27.012351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-01T13:56:27.012351.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_01T13_56_27.012351", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-01T13:56:27.012351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-01T13:56:27.012351.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_01T13_56_27.012351", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-01T13:56:27.012351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-01T13:56:27.012351.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_01T13_56_27.012351", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-01T13:56:27.012351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-01T13:56:27.012351.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_01T13_56_27.012351", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-01T13:56:27.012351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-01T13:56:27.012351.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_01T13_56_27.012351", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-01T13:56:27.012351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-01T13:56:27.012351.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_01T13_56_27.012351", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-01T13:56:27.012351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-01T13:56:27.012351.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_01T13_56_27.012351", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-01T13:56:27.012351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-01T13:56:27.012351.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_01T13_56_27.012351", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-01T13:56:27.012351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-01T13:56:27.012351.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_01T13_56_27.012351", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-01T13:56:27.012351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-01T13:56:27.012351.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_01T13_56_27.012351", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-01T13:56:27.012351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-01T13:56:27.012351.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_01T13_56_27.012351", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-01T13:56:27.012351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-01T13:56:27.012351.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_01T13_56_27.012351", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-01T13:56:27.012351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-01T13:56:27.012351.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_01T13_56_27.012351", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-01T13:56:27.012351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-01T13:56:27.012351.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_01T13_56_27.012351", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-01T13:56:27.012351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-01T13:56:27.012351.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_01T13_56_27.012351", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-01T13:56:27.012351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-01T13:56:27.012351.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_01T13_56_27.012351", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-01T13:56:27.012351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-01T13:56:27.012351.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_01T13_56_27.012351", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-01T13:56:27.012351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-01T13:56:27.012351.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_01T13_56_27.012351", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-01T13:56:27.012351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-01T13:56:27.012351.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_01T13_56_27.012351", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-01T13:56:27.012351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-01T13:56:27.012351.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_01T13_56_27.012351", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-01T13:56:27.012351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-01T13:56:27.012351.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_01T13_56_27.012351", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-01T13:56:27.012351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-01T13:56:27.012351.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_01T13_56_27.012351", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-01T13:56:27.012351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-01T13:56:27.012351.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_01T13_56_27.012351", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-01T13:56:27.012351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-01T13:56:27.012351.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_01T13_56_27.012351", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-01T13:56:27.012351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-01T13:56:27.012351.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_01T13_56_27.012351", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-01T13:56:27.012351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-01T13:56:27.012351.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_01T13_56_27.012351", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-01T13:56:27.012351.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-01T13:56:27.012351.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_01T13_56_27.012351", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-01T13:56:27.012351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-01T13:56:27.012351.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_01T13_56_27.012351", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-01T13:56:27.012351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-01T13:56:27.012351.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_01T13_56_27.012351", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-01T13:56:27.012351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-01T13:56:27.012351.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_01T13_56_27.012351", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-01T13:56:27.012351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-01T13:56:27.012351.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_01T13_56_27.012351", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-01T13:56:27.012351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-01T13:56:27.012351.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_01T13_56_27.012351", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-01T13:56:27.012351.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-01T13:56:27.012351.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_01T13_56_27.012351", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-01T13:56:27.012351.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-01T13:56:27.012351.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_22T05_46_44.212362", "path": ["**/details_harness|winogrande|5_2023-10-22T05-46-44.212362.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-22T05-46-44.212362.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_01T13_56_27.012351", "path": ["results_2023-08-01T13:56:27.012351.parquet"]}, {"split": "2023_10_22T05_46_44.212362", "path": ["results_2023-10-22T05-46-44.212362.parquet"]}, {"split": "latest", "path": ["results_2023-10-22T05-46-44.212362.parquet"]}]}]}
# Dataset Card for Evaluation run of TheBloke/WizardLM-13B-V1-1-SuperHOT-8K-fp16
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/WizardLM-13B-V1-1-SuperHOT-8K-fp16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/WizardLM-13B-V1-1-SuperHOT-8K-fp16](https://huggingface.co/TheBloke/WizardLM-13B-V1-1-SuperHOT-8K-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
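A minimal sketch, assuming the repository follows the leaderboard's usual `details_<org>__<model>` naming convention and using the `harness_winogrande_5` configuration listed in this dataset's config list:

```python
from datasets import load_dataset

# Repository id assumed from the model name by the leaderboard's convention.
data = load_dataset("open-llm-leaderboard/details_TheBloke__WizardLM-13B-V1-1-SuperHOT-8K-fp16",
	"harness_winogrande_5",
	split="train")
```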
## Latest results
These are the latest results from run 2023-10-22T05:46:44.212362 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
# Dataset Card for Evaluation run of TheBloke/Chinese-Alpaca-33B-SuperHOT-8K-fp16
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/Chinese-Alpaca-33B-SuperHOT-8K-fp16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/Chinese-Alpaca-33B-SuperHOT-8K-fp16](https://huggingface.co/TheBloke/Chinese-Alpaca-33B-SuperHOT-8K-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__Chinese-Alpaca-33B-SuperHOT-8K-fp16",
"harness_truthfulqa_mc_0",
split="train")
```
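
Each run is also stored under its own timestamped split, so a specific run can be loaded directly. A sketch using the split name recorded in this dataset's configuration:

```python
from datasets import load_dataset

# Load one specific run instead of the latest results; the split name is the
# run timestamp as recorded in the dataset configuration.
data = load_dataset("open-llm-leaderboard/details_TheBloke__Chinese-Alpaca-33B-SuperHOT-8K-fp16",
	"harness_truthfulqa_mc_0",
	split="2023_07_31T19_21_09.032023")
```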
## Latest results
These are the [latest results from run 2023-07-31T19:21:09.032023](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Chinese-Alpaca-33B-SuperHOT-8K-fp16/blob/main/results_2023-07-31T19%3A21%3A09.032023.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.24079112101610886,
"acc_stderr": 0.030961801782247226,
"acc_norm": 0.24208994950215265,
"acc_norm_stderr": 0.03097894827141845,
"mc1": 0.23378212974296206,
"mc1_stderr": 0.014816195991931588,
"mc2": 0.4774590793334822,
"mc2_stderr": 0.01691343346185639
},
"harness|arc:challenge|25": {
"acc": 0.2175767918088737,
"acc_stderr": 0.0120572620209725,
"acc_norm": 0.26791808873720135,
"acc_norm_stderr": 0.012942030195136426
},
"harness|hellaswag|10": {
"acc": 0.26926906990639315,
"acc_stderr": 0.004426734718808876,
"acc_norm": 0.29555865365465045,
"acc_norm_stderr": 0.004553609405747228
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.035914440841969694,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.035914440841969694
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2188679245283019,
"acc_stderr": 0.025447863825108608,
"acc_norm": 0.2188679245283019,
"acc_norm_stderr": 0.025447863825108608
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.15,
"acc_stderr": 0.03588702812826372,
"acc_norm": 0.15,
"acc_norm_stderr": 0.03588702812826372
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749874,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749874
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.0414243971948936,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.0414243971948936
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2275132275132275,
"acc_stderr": 0.021591269407823778,
"acc_norm": 0.2275132275132275,
"acc_norm_stderr": 0.021591269407823778
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.037184890068181146,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.037184890068181146
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.25161290322580643,
"acc_stderr": 0.02468597928623997,
"acc_norm": 0.25161290322580643,
"acc_norm_stderr": 0.02468597928623997
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.23645320197044334,
"acc_stderr": 0.029896114291733552,
"acc_norm": 0.23645320197044334,
"acc_norm_stderr": 0.029896114291733552
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.19,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.19,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2727272727272727,
"acc_stderr": 0.0347769116216366,
"acc_norm": 0.2727272727272727,
"acc_norm_stderr": 0.0347769116216366
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.21212121212121213,
"acc_stderr": 0.029126522834586804,
"acc_norm": 0.21212121212121213,
"acc_norm_stderr": 0.029126522834586804
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.21761658031088082,
"acc_stderr": 0.02977866303775296,
"acc_norm": 0.21761658031088082,
"acc_norm_stderr": 0.02977866303775296
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2128205128205128,
"acc_stderr": 0.020752423722128013,
"acc_norm": 0.2128205128205128,
"acc_norm_stderr": 0.020752423722128013
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.026962424325073828,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.026962424325073828
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436776,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436776
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1926605504587156,
"acc_stderr": 0.016909276884936087,
"acc_norm": 0.1926605504587156,
"acc_norm_stderr": 0.016909276884936087
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.03058759135160425,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.03058759135160425
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.28270042194092826,
"acc_stderr": 0.02931281415395592,
"acc_norm": 0.28270042194092826,
"acc_norm_stderr": 0.02931281415395592
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.28699551569506726,
"acc_stderr": 0.030360379710291947,
"acc_norm": 0.28699551569506726,
"acc_norm_stderr": 0.030360379710291947
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.29914529914529914,
"acc_stderr": 0.029996951858349497,
"acc_norm": 0.29914529914529914,
"acc_norm_stderr": 0.029996951858349497
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.26947637292464877,
"acc_stderr": 0.01586624307321506,
"acc_norm": 0.26947637292464877,
"acc_norm_stderr": 0.01586624307321506
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.26878612716763006,
"acc_stderr": 0.023868003262500114,
"acc_norm": 0.26878612716763006,
"acc_norm_stderr": 0.023868003262500114
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.02355083135199509,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.02355083135199509
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.02212243977248077,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.02212243977248077
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.22340425531914893,
"acc_stderr": 0.024847921358063962,
"acc_norm": 0.22340425531914893,
"acc_norm_stderr": 0.024847921358063962
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.024562204314142314,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.024562204314142314
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.20408163265306123,
"acc_stderr": 0.02580128347509051,
"acc_norm": 0.20408163265306123,
"acc_norm_stderr": 0.02580128347509051
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-virology|5": {
"acc": 0.26506024096385544,
"acc_stderr": 0.03436024037944967,
"acc_norm": 0.26506024096385544,
"acc_norm_stderr": 0.03436024037944967
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23378212974296206,
"mc1_stderr": 0.014816195991931588,
"mc2": 0.4774590793334822,
"mc2_stderr": 0.01691343346185639
}
}
```
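
The aggregated numbers above are also stored in the "results" configuration. A minimal sketch for loading them programmatically, assuming the "latest" split naming used throughout this card:

```python
from datasets import load_dataset

# One record per evaluation run; "latest" points at the most recent results.
results = load_dataset("open-llm-leaderboard/details_TheBloke__Chinese-Alpaca-33B-SuperHOT-8K-fp16",
	"results",
	split="latest")
print(results[0])
```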
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
{"pretty_name": "Evaluation run of TheBloke/Chinese-Alpaca-33B-SuperHOT-8K-fp16", "dataset_summary": "Dataset automatically created during the evaluation run of model [TheBloke/Chinese-Alpaca-33B-SuperHOT-8K-fp16](https://huggingface.co/TheBloke/Chinese-Alpaca-33B-SuperHOT-8K-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__Chinese-Alpaca-33B-SuperHOT-8K-fp16\",\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-07-31T19:21:09.032023](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Chinese-Alpaca-33B-SuperHOT-8K-fp16/blob/main/results_2023-07-31T19%3A21%3A09.032023.json) (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.24079112101610886,\n \"acc_stderr\": 0.030961801782247226,\n \"acc_norm\": 0.24208994950215265,\n \"acc_norm_stderr\": 0.03097894827141845,\n \"mc1\": 0.23378212974296206,\n \"mc1_stderr\": 0.014816195991931588,\n \"mc2\": 0.4774590793334822,\n \"mc2_stderr\": 0.01691343346185639\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.2175767918088737,\n \"acc_stderr\": 0.0120572620209725,\n \"acc_norm\": 0.26791808873720135,\n \"acc_norm_stderr\": 0.012942030195136426\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.26926906990639315,\n \"acc_stderr\": 0.004426734718808876,\n \"acc_norm\": 0.29555865365465045,\n \"acc_norm_stderr\": 0.004553609405747228\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.035914440841969694,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.035914440841969694\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.2188679245283019,\n \"acc_stderr\": 0.025447863825108608,\n \"acc_norm\": 0.2188679245283019,\n \"acc_norm_stderr\": 0.025447863825108608\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.03476590104304134\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.15,\n \"acc_stderr\": 0.03588702812826372,\n \"acc_norm\": 0.15,\n \"acc_norm_stderr\": 0.03588702812826372\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n \"acc_stderr\": 0.030952890217749874,\n \"acc_norm\": 0.20809248554913296,\n \"acc_norm_stderr\": 0.030952890217749874\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n \"acc_stderr\": 0.0414243971948936,\n \"acc_norm\": 0.2631578947368421,\n \"acc_norm_stderr\": 0.0414243971948936\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2275132275132275,\n \"acc_stderr\": 0.021591269407823778,\n \"acc_norm\": 0.2275132275132275,\n \"acc_norm_stderr\": 0.021591269407823778\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.037184890068181146,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.037184890068181146\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.25161290322580643,\n \"acc_stderr\": 0.02468597928623997,\n \"acc_norm\": 0.25161290322580643,\n \"acc_norm_stderr\": 0.02468597928623997\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.23645320197044334,\n \"acc_stderr\": 0.029896114291733552,\n \"acc_norm\": 0.23645320197044334,\n \"acc_norm_stderr\": 0.029896114291733552\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.19,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.2727272727272727,\n \"acc_stderr\": 0.0347769116216366,\n \"acc_norm\": 0.2727272727272727,\n \"acc_norm_stderr\": 0.0347769116216366\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.21212121212121213,\n \"acc_stderr\": 0.029126522834586804,\n \"acc_norm\": 0.21212121212121213,\n \"acc_norm_stderr\": 0.029126522834586804\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.21761658031088082,\n \"acc_stderr\": 0.02977866303775296,\n \"acc_norm\": 0.21761658031088082,\n 
\"acc_norm_stderr\": 0.02977866303775296\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.2128205128205128,\n \"acc_stderr\": 0.020752423722128013,\n \"acc_norm\": 0.2128205128205128,\n \"acc_norm_stderr\": 0.020752423722128013\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.26666666666666666,\n \"acc_stderr\": 0.026962424325073828,\n \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.026962424325073828\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.1926605504587156,\n \"acc_stderr\": 0.016909276884936087,\n \"acc_norm\": 0.1926605504587156,\n \"acc_norm_stderr\": 0.016909276884936087\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.03058759135160425,\n \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.03058759135160425\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.28270042194092826,\n \"acc_stderr\": 0.02931281415395592,\n \"acc_norm\": 0.28270042194092826,\n \"acc_norm_stderr\": 0.02931281415395592\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.28699551569506726,\n \"acc_stderr\": 0.030360379710291947,\n \"acc_norm\": 0.28699551569506726,\n \"acc_norm_stderr\": 0.030360379710291947\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.29914529914529914,\n \"acc_stderr\": 0.029996951858349497,\n \"acc_norm\": 0.29914529914529914,\n \"acc_norm_stderr\": 0.029996951858349497\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n 
\"acc\": 0.26947637292464877,\n \"acc_stderr\": 0.01586624307321506,\n \"acc_norm\": 0.26947637292464877,\n \"acc_norm_stderr\": 0.01586624307321506\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.26878612716763006,\n \"acc_stderr\": 0.023868003262500114,\n \"acc_norm\": 0.26878612716763006,\n \"acc_norm_stderr\": 0.023868003262500114\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.02355083135199509,\n \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.02355083135199509\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.22340425531914893,\n \"acc_stderr\": 0.024847921358063962,\n \"acc_norm\": 0.22340425531914893,\n \"acc_norm_stderr\": 0.024847921358063962\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.024562204314142314,\n \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.024562204314142314\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.20408163265306123,\n \"acc_stderr\": 0.02580128347509051,\n \"acc_norm\": 0.20408163265306123,\n \"acc_norm_stderr\": 0.02580128347509051\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.26506024096385544,\n \"acc_stderr\": 0.03436024037944967,\n \"acc_norm\": 0.26506024096385544,\n \"acc_norm_stderr\": 0.03436024037944967\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3216374269005848,\n \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.3216374269005848,\n \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23378212974296206,\n \"mc1_stderr\": 0.014816195991931588,\n \"mc2\": 0.4774590793334822,\n \"mc2_stderr\": 0.01691343346185639\n }\n}\n```", "repo_url": "https://huggingface.co/TheBloke/Chinese-Alpaca-33B-SuperHOT-8K-fp16", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": 
"[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_31T19_21_09.032023", "path": ["**/details_harness|arc:challenge|25_2023-07-31T19:21:09.032023.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-31T19:21:09.032023.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_31T19_21_09.032023", "path": ["**/details_harness|hellaswag|10_2023-07-31T19:21:09.032023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-31T19:21:09.032023.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_31T19_21_09.032023", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T19:21:09.032023.parquet", 
"**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-31T19:21:09.032023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-31T19:21:09.032023.parquet", 
"**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T19:21:09.032023.parquet", 
"**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-31T19:21:09.032023.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-31T19:21:09.032023.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_31T19_21_09.032023", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T19:21:09.032023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T19:21:09.032023.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_31T19_21_09.032023", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-31T19:21:09.032023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-31T19:21:09.032023.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_31T19_21_09.032023", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-31T19:21:09.032023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-31T19:21:09.032023.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_31T19_21_09.032023", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T19:21:09.032023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T19:21:09.032023.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_31T19_21_09.032023", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T19:21:09.032023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T19:21:09.032023.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_31T19_21_09.032023", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-31T19:21:09.032023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-31T19:21:09.032023.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_31T19_21_09.032023", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T19:21:09.032023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T19:21:09.032023.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_31T19_21_09.032023", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T19:21:09.032023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T19:21:09.032023.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_31T19_21_09.032023", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T19:21:09.032023.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T19:21:09.032023.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_31T19_21_09.032023", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T19:21:09.032023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T19:21:09.032023.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_31T19_21_09.032023", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-31T19:21:09.032023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-31T19:21:09.032023.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_31T19_21_09.032023", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-31T19:21:09.032023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-31T19:21:09.032023.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_31T19_21_09.032023", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T19:21:09.032023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T19:21:09.032023.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_31T19_21_09.032023", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-31T19:21:09.032023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-31T19:21:09.032023.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_31T19_21_09.032023", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T19:21:09.032023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T19:21:09.032023.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_31T19_21_09.032023", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T19:21:09.032023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T19:21:09.032023.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_31T19_21_09.032023", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T19:21:09.032023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T19:21:09.032023.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_31T19_21_09.032023", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-31T19:21:09.032023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-31T19:21:09.032023.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_31T19_21_09.032023", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T19:21:09.032023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T19:21:09.032023.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_31T19_21_09.032023", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T19:21:09.032023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T19:21:09.032023.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_31T19_21_09.032023", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T19:21:09.032023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T19:21:09.032023.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_31T19_21_09.032023", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T19:21:09.032023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T19:21:09.032023.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_31T19_21_09.032023", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T19:21:09.032023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T19:21:09.032023.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_31T19_21_09.032023", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T19:21:09.032023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T19:21:09.032023.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_31T19_21_09.032023", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T19:21:09.032023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T19:21:09.032023.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_31T19_21_09.032023", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T19:21:09.032023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T19:21:09.032023.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_31T19_21_09.032023", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T19:21:09.032023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T19:21:09.032023.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_31T19_21_09.032023", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T19:21:09.032023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T19:21:09.032023.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_31T19_21_09.032023", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T19:21:09.032023.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T19:21:09.032023.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_31T19_21_09.032023", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T19:21:09.032023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T19:21:09.032023.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_31T19_21_09.032023", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T19:21:09.032023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T19:21:09.032023.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_31T19_21_09.032023", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T19:21:09.032023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T19:21:09.032023.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_31T19_21_09.032023", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-31T19:21:09.032023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-31T19:21:09.032023.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_31T19_21_09.032023", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T19:21:09.032023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T19:21:09.032023.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_31T19_21_09.032023", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-31T19:21:09.032023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-31T19:21:09.032023.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_31T19_21_09.032023", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T19:21:09.032023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T19:21:09.032023.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_31T19_21_09.032023", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T19:21:09.032023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T19:21:09.032023.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_31T19_21_09.032023", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T19:21:09.032023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T19:21:09.032023.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_31T19_21_09.032023", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-31T19:21:09.032023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-31T19:21:09.032023.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_31T19_21_09.032023", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-31T19:21:09.032023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-31T19:21:09.032023.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_31T19_21_09.032023", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T19:21:09.032023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T19:21:09.032023.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_31T19_21_09.032023", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T19:21:09.032023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T19:21:09.032023.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_31T19_21_09.032023", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T19:21:09.032023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T19:21:09.032023.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_31T19_21_09.032023", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T19:21:09.032023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T19:21:09.032023.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_31T19_21_09.032023", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-31T19:21:09.032023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-31T19:21:09.032023.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_31T19_21_09.032023", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-31T19:21:09.032023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-31T19:21:09.032023.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_31T19_21_09.032023", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-31T19:21:09.032023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-31T19:21:09.032023.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_31T19_21_09.032023", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T19:21:09.032023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T19:21:09.032023.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_31T19_21_09.032023", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-31T19:21:09.032023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-31T19:21:09.032023.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_31T19_21_09.032023", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T19:21:09.032023.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T19:21:09.032023.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_31T19_21_09.032023", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T19:21:09.032023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T19:21:09.032023.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_31T19_21_09.032023", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-31T19:21:09.032023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-31T19:21:09.032023.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_31T19_21_09.032023", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-31T19:21:09.032023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-31T19:21:09.032023.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_31T19_21_09.032023", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-31T19:21:09.032023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-31T19:21:09.032023.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_31T19_21_09.032023", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T19:21:09.032023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T19:21:09.032023.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_31T19_21_09.032023", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-31T19:21:09.032023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-31T19:21:09.032023.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_31T19_21_09.032023", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-31T19:21:09.032023.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-31T19:21:09.032023.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_31T19_21_09.032023", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-31T19:21:09.032023.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-31T19:21:09.032023.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_31T19_21_09.032023", "path": ["results_2023-07-31T19:21:09.032023.parquet"]}, {"split": "latest", "path": ["results_2023-07-31T19:21:09.032023.parquet"]}]}]}
|
2023-08-27T11:33:26+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of TheBloke/Chinese-Alpaca-33B-SuperHOT-8K-fp16
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model TheBloke/Chinese-Alpaca-33B-SuperHOT-8K-fp16 on the Open LLM Leaderboard.
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
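A minimal sketch of such a call (assuming the standard `open-llm-leaderboard/details_<org>__<model>` repo naming for this model, and picking one config name, `harness_truthfulqa_mc_0`, from the list declared in this card's metadata):

```python
from datasets import load_dataset

# Repo id and config name are inferred from this card's metadata, not quoted from the original.
data = load_dataset("open-llm-leaderboard/details_TheBloke__Chinese-Alpaca-33B-SuperHOT-8K-fp16",
                    "harness_truthfulqa_mc_0",
                    split="train")
```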
## Latest results
These are the latest results from run 2023-07-31T19:21:09.032023 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of TheBloke/Chinese-Alpaca-33B-SuperHOT-8K-fp16",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/Chinese-Alpaca-33B-SuperHOT-8K-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-07-31T19:21:09.032023 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of TheBloke/Chinese-Alpaca-33B-SuperHOT-8K-fp16",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/Chinese-Alpaca-33B-SuperHOT-8K-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-07-31T19:21:09.032023 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
32,
31,
180,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of TheBloke/Chinese-Alpaca-33B-SuperHOT-8K-fp16## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/Chinese-Alpaca-33B-SuperHOT-8K-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-07-31T19:21:09.032023 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
00369c557cdfd39d679c7d71fb1e25926260d884
|
# Dataset Card for Evaluation run of TheBloke/wizardLM-7B-HF
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/wizardLM-7B-HF
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/wizardLM-7B-HF](https://huggingface.co/TheBloke/wizardLM-7B-HF) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__wizardLM-7B-HF",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-07-18T11:33:18.439367](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__wizardLM-7B-HF/blob/main/results_2023-07-18T11%3A33%3A18.439367.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.38566819917906325,
"acc_stderr": 0.03482242619787474,
"acc_norm": 0.3891088361419288,
"acc_norm_stderr": 0.03481173503822327,
"mc1": 0.31456548347613217,
"mc1_stderr": 0.01625524199317919,
"mc2": 0.45584096136441793,
"mc2_stderr": 0.016028055350830416
},
"harness|arc:challenge|25": {
"acc": 0.48464163822525597,
"acc_stderr": 0.014604496129394913,
"acc_norm": 0.5034129692832765,
"acc_norm_stderr": 0.014611050403244081
},
"harness|hellaswag|10": {
"acc": 0.5685122485560645,
"acc_stderr": 0.004942716091996078,
"acc_norm": 0.7527384983071101,
"acc_norm_stderr": 0.004305383398710189
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.43703703703703706,
"acc_stderr": 0.042849586397534,
"acc_norm": 0.43703703703703706,
"acc_norm_stderr": 0.042849586397534
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.40131578947368424,
"acc_stderr": 0.03988903703336284,
"acc_norm": 0.40131578947368424,
"acc_norm_stderr": 0.03988903703336284
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4377358490566038,
"acc_stderr": 0.03053333843046751,
"acc_norm": 0.4377358490566038,
"acc_norm_stderr": 0.03053333843046751
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3680555555555556,
"acc_stderr": 0.04032999053960719,
"acc_norm": 0.3680555555555556,
"acc_norm_stderr": 0.04032999053960719
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.35260115606936415,
"acc_stderr": 0.036430371689585475,
"acc_norm": 0.35260115606936415,
"acc_norm_stderr": 0.036430371689585475
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4,
"acc_stderr": 0.03202563076101735,
"acc_norm": 0.4,
"acc_norm_stderr": 0.03202563076101735
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.04142439719489362,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.04142439719489362
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.32413793103448274,
"acc_stderr": 0.03900432069185555,
"acc_norm": 0.32413793103448274,
"acc_norm_stderr": 0.03900432069185555
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30687830687830686,
"acc_stderr": 0.02375292871211214,
"acc_norm": 0.30687830687830686,
"acc_norm_stderr": 0.02375292871211214
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.03893259610604675,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.03893259610604675
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.36129032258064514,
"acc_stderr": 0.02732754844795754,
"acc_norm": 0.36129032258064514,
"acc_norm_stderr": 0.02732754844795754
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.30049261083743845,
"acc_stderr": 0.03225799476233484,
"acc_norm": 0.30049261083743845,
"acc_norm_stderr": 0.03225799476233484
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.45454545454545453,
"acc_stderr": 0.03888176921674099,
"acc_norm": 0.45454545454545453,
"acc_norm_stderr": 0.03888176921674099
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.42424242424242425,
"acc_stderr": 0.03521224908841583,
"acc_norm": 0.42424242424242425,
"acc_norm_stderr": 0.03521224908841583
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.46632124352331605,
"acc_stderr": 0.03600244069867178,
"acc_norm": 0.46632124352331605,
"acc_norm_stderr": 0.03600244069867178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.35384615384615387,
"acc_stderr": 0.024243783994062164,
"acc_norm": 0.35384615384615387,
"acc_norm_stderr": 0.024243783994062164
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24444444444444444,
"acc_stderr": 0.026202766534652148,
"acc_norm": 0.24444444444444444,
"acc_norm_stderr": 0.026202766534652148
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3277310924369748,
"acc_stderr": 0.030489911417673227,
"acc_norm": 0.3277310924369748,
"acc_norm_stderr": 0.030489911417673227
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2781456953642384,
"acc_stderr": 0.03658603262763743,
"acc_norm": 0.2781456953642384,
"acc_norm_stderr": 0.03658603262763743
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.46605504587155966,
"acc_stderr": 0.021387863350353992,
"acc_norm": 0.46605504587155966,
"acc_norm_stderr": 0.021387863350353992
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2175925925925926,
"acc_stderr": 0.028139689444859672,
"acc_norm": 0.2175925925925926,
"acc_norm_stderr": 0.028139689444859672
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.45588235294117646,
"acc_stderr": 0.03495624522015474,
"acc_norm": 0.45588235294117646,
"acc_norm_stderr": 0.03495624522015474
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.43037974683544306,
"acc_stderr": 0.03223017195937597,
"acc_norm": 0.43037974683544306,
"acc_norm_stderr": 0.03223017195937597
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5112107623318386,
"acc_stderr": 0.033549366530984746,
"acc_norm": 0.5112107623318386,
"acc_norm_stderr": 0.033549366530984746
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.3893129770992366,
"acc_stderr": 0.04276486542814591,
"acc_norm": 0.3893129770992366,
"acc_norm_stderr": 0.04276486542814591
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5785123966942148,
"acc_stderr": 0.04507732278775087,
"acc_norm": 0.5785123966942148,
"acc_norm_stderr": 0.04507732278775087
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.4537037037037037,
"acc_stderr": 0.04812917324536821,
"acc_norm": 0.4537037037037037,
"acc_norm_stderr": 0.04812917324536821
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3987730061349693,
"acc_stderr": 0.038470214204560246,
"acc_norm": 0.3987730061349693,
"acc_norm_stderr": 0.038470214204560246
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.3592233009708738,
"acc_stderr": 0.047504583990416946,
"acc_norm": 0.3592233009708738,
"acc_norm_stderr": 0.047504583990416946
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.5170940170940171,
"acc_stderr": 0.032736940493481824,
"acc_norm": 0.5170940170940171,
"acc_norm_stderr": 0.032736940493481824
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.42,
"acc_stderr": 0.04960449637488584,
"acc_norm": 0.42,
"acc_norm_stderr": 0.04960449637488584
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.545338441890166,
"acc_stderr": 0.017806304585052602,
"acc_norm": 0.545338441890166,
"acc_norm_stderr": 0.017806304585052602
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.38439306358381503,
"acc_stderr": 0.026189666966272035,
"acc_norm": 0.38439306358381503,
"acc_norm_stderr": 0.026189666966272035
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23016759776536314,
"acc_stderr": 0.014078339253425819,
"acc_norm": 0.23016759776536314,
"acc_norm_stderr": 0.014078339253425819
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4084967320261438,
"acc_stderr": 0.028146405993096358,
"acc_norm": 0.4084967320261438,
"acc_norm_stderr": 0.028146405993096358
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3858520900321543,
"acc_stderr": 0.027648149599751457,
"acc_norm": 0.3858520900321543,
"acc_norm_stderr": 0.027648149599751457
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.39814814814814814,
"acc_stderr": 0.027237415094592477,
"acc_norm": 0.39814814814814814,
"acc_norm_stderr": 0.027237415094592477
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3191489361702128,
"acc_stderr": 0.027807990141320193,
"acc_norm": 0.3191489361702128,
"acc_norm_stderr": 0.027807990141320193
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3220338983050847,
"acc_stderr": 0.01193393607189109,
"acc_norm": 0.3220338983050847,
"acc_norm_stderr": 0.01193393607189109
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3860294117647059,
"acc_stderr": 0.029573269134411124,
"acc_norm": 0.3860294117647059,
"acc_norm_stderr": 0.029573269134411124
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.40032679738562094,
"acc_stderr": 0.019821843688271765,
"acc_norm": 0.40032679738562094,
"acc_norm_stderr": 0.019821843688271765
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.41818181818181815,
"acc_stderr": 0.04724577405731571,
"acc_norm": 0.41818181818181815,
"acc_norm_stderr": 0.04724577405731571
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3142857142857143,
"acc_stderr": 0.029719329422417482,
"acc_norm": 0.3142857142857143,
"acc_norm_stderr": 0.029719329422417482
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.47761194029850745,
"acc_stderr": 0.035319879302087305,
"acc_norm": 0.47761194029850745,
"acc_norm_stderr": 0.035319879302087305
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-virology|5": {
"acc": 0.39156626506024095,
"acc_stderr": 0.03799857454479637,
"acc_norm": 0.39156626506024095,
"acc_norm_stderr": 0.03799857454479637
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.5380116959064327,
"acc_stderr": 0.038237270928823064,
"acc_norm": 0.5380116959064327,
"acc_norm_stderr": 0.038237270928823064
},
"harness|truthfulqa:mc|0": {
"mc1": 0.31456548347613217,
"mc1_stderr": 0.01625524199317919,
"mc2": 0.45584096136441793,
"mc2_stderr": 0.016028055350830416
}
}
```
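Beyond a single task's details, the aggregated metrics above can also be loaded directly through the "results" configuration. A minimal sketch, assuming the `results` config and `latest` split declared in this card's metadata:

```python
from datasets import load_dataset

# Load the aggregated per-run metrics rather than one task's detailed predictions.
results = load_dataset("open-llm-leaderboard/details_TheBloke__wizardLM-7B-HF",
                       "results",
                       split="latest")
print(results[0])  # one row of aggregated metrics for the latest run
```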
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_TheBloke__wizardLM-7B-HF
|
[
"region:us"
] |
2023-08-18T10:25:07+00:00
|
{"pretty_name": "Evaluation run of TheBloke/wizardLM-7B-HF", "dataset_summary": "Dataset automatically created during the evaluation run of model [TheBloke/wizardLM-7B-HF](https://huggingface.co/TheBloke/wizardLM-7B-HF) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__wizardLM-7B-HF\",\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-07-18T11:33:18.439367](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__wizardLM-7B-HF/blob/main/results_2023-07-18T11%3A33%3A18.439367.json) (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.38566819917906325,\n \"acc_stderr\": 0.03482242619787474,\n \"acc_norm\": 0.3891088361419288,\n \"acc_norm_stderr\": 0.03481173503822327,\n \"mc1\": 0.31456548347613217,\n \"mc1_stderr\": 0.01625524199317919,\n \"mc2\": 0.45584096136441793,\n \"mc2_stderr\": 0.016028055350830416\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.48464163822525597,\n \"acc_stderr\": 0.014604496129394913,\n \"acc_norm\": 0.5034129692832765,\n \"acc_norm_stderr\": 0.014611050403244081\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5685122485560645,\n \"acc_stderr\": 0.004942716091996078,\n \"acc_norm\": 0.7527384983071101,\n \"acc_norm_stderr\": 0.004305383398710189\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.43703703703703706,\n \"acc_stderr\": 0.042849586397534,\n \"acc_norm\": 0.43703703703703706,\n \"acc_norm_stderr\": 0.042849586397534\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.40131578947368424,\n \"acc_stderr\": 0.03988903703336284,\n \"acc_norm\": 0.40131578947368424,\n \"acc_norm_stderr\": 0.03988903703336284\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.4377358490566038,\n \"acc_stderr\": 0.03053333843046751,\n \"acc_norm\": 0.4377358490566038,\n \"acc_norm_stderr\": 0.03053333843046751\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3680555555555556,\n \"acc_stderr\": 0.04032999053960719,\n \"acc_norm\": 0.3680555555555556,\n \"acc_norm_stderr\": 0.04032999053960719\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.24,\n 
\"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.35260115606936415,\n \"acc_stderr\": 0.036430371689585475,\n \"acc_norm\": 0.35260115606936415,\n \"acc_norm_stderr\": 0.036430371689585475\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.03202563076101735,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.03202563076101735\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n \"acc_stderr\": 0.04142439719489362,\n \"acc_norm\": 0.2631578947368421,\n \"acc_norm_stderr\": 0.04142439719489362\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.32413793103448274,\n \"acc_stderr\": 0.03900432069185555,\n \"acc_norm\": 0.32413793103448274,\n \"acc_norm_stderr\": 0.03900432069185555\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.30687830687830686,\n \"acc_stderr\": 0.02375292871211214,\n \"acc_norm\": 0.30687830687830686,\n \"acc_norm_stderr\": 0.02375292871211214\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.25396825396825395,\n \"acc_stderr\": 0.03893259610604675,\n \"acc_norm\": 0.25396825396825395,\n \"acc_norm_stderr\": 0.03893259610604675\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.36129032258064514,\n \"acc_stderr\": 0.02732754844795754,\n \"acc_norm\": 0.36129032258064514,\n \"acc_norm_stderr\": 0.02732754844795754\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.30049261083743845,\n \"acc_stderr\": 0.03225799476233484,\n \"acc_norm\": 0.30049261083743845,\n \"acc_norm_stderr\": 0.03225799476233484\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.45454545454545453,\n \"acc_stderr\": 0.03888176921674099,\n \"acc_norm\": 0.45454545454545453,\n \"acc_norm_stderr\": 0.03888176921674099\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.42424242424242425,\n \"acc_stderr\": 0.03521224908841583,\n \"acc_norm\": 0.42424242424242425,\n \"acc_norm_stderr\": 0.03521224908841583\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.46632124352331605,\n \"acc_stderr\": 0.03600244069867178,\n \"acc_norm\": 0.46632124352331605,\n \"acc_norm_stderr\": 0.03600244069867178\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.35384615384615387,\n \"acc_stderr\": 
0.024243783994062164,\n \"acc_norm\": 0.35384615384615387,\n \"acc_norm_stderr\": 0.024243783994062164\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.24444444444444444,\n \"acc_stderr\": 0.026202766534652148,\n \"acc_norm\": 0.24444444444444444,\n \"acc_norm_stderr\": 0.026202766534652148\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.3277310924369748,\n \"acc_stderr\": 0.030489911417673227,\n \"acc_norm\": 0.3277310924369748,\n \"acc_norm_stderr\": 0.030489911417673227\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2781456953642384,\n \"acc_stderr\": 0.03658603262763743,\n \"acc_norm\": 0.2781456953642384,\n \"acc_norm_stderr\": 0.03658603262763743\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.46605504587155966,\n \"acc_stderr\": 0.021387863350353992,\n \"acc_norm\": 0.46605504587155966,\n \"acc_norm_stderr\": 0.021387863350353992\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.2175925925925926,\n \"acc_stderr\": 0.028139689444859672,\n \"acc_norm\": 0.2175925925925926,\n \"acc_norm_stderr\": 0.028139689444859672\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.45588235294117646,\n \"acc_stderr\": 0.03495624522015474,\n \"acc_norm\": 0.45588235294117646,\n \"acc_norm_stderr\": 0.03495624522015474\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.43037974683544306,\n \"acc_stderr\": 0.03223017195937597,\n \"acc_norm\": 0.43037974683544306,\n \"acc_norm_stderr\": 0.03223017195937597\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5112107623318386,\n \"acc_stderr\": 0.033549366530984746,\n \"acc_norm\": 0.5112107623318386,\n \"acc_norm_stderr\": 0.033549366530984746\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.3893129770992366,\n \"acc_stderr\": 0.04276486542814591,\n \"acc_norm\": 0.3893129770992366,\n \"acc_norm_stderr\": 0.04276486542814591\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.5785123966942148,\n \"acc_stderr\": 0.04507732278775087,\n \"acc_norm\": 0.5785123966942148,\n \"acc_norm_stderr\": 0.04507732278775087\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.4537037037037037,\n \"acc_stderr\": 0.04812917324536821,\n \"acc_norm\": 0.4537037037037037,\n \"acc_norm_stderr\": 0.04812917324536821\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.3987730061349693,\n \"acc_stderr\": 0.038470214204560246,\n \"acc_norm\": 0.3987730061349693,\n \"acc_norm_stderr\": 0.038470214204560246\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.3592233009708738,\n \"acc_stderr\": 0.047504583990416946,\n \"acc_norm\": 0.3592233009708738,\n \"acc_norm_stderr\": 0.047504583990416946\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.5170940170940171,\n \"acc_stderr\": 0.032736940493481824,\n \"acc_norm\": 0.5170940170940171,\n \"acc_norm_stderr\": 0.032736940493481824\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.04960449637488584,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.04960449637488584\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.545338441890166,\n \"acc_stderr\": 0.017806304585052602,\n \"acc_norm\": 0.545338441890166,\n \"acc_norm_stderr\": 
0.017806304585052602\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.38439306358381503,\n \"acc_stderr\": 0.026189666966272035,\n \"acc_norm\": 0.38439306358381503,\n \"acc_norm_stderr\": 0.026189666966272035\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23016759776536314,\n \"acc_stderr\": 0.014078339253425819,\n \"acc_norm\": 0.23016759776536314,\n \"acc_norm_stderr\": 0.014078339253425819\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.4084967320261438,\n \"acc_stderr\": 0.028146405993096358,\n \"acc_norm\": 0.4084967320261438,\n \"acc_norm_stderr\": 0.028146405993096358\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3858520900321543,\n \"acc_stderr\": 0.027648149599751457,\n \"acc_norm\": 0.3858520900321543,\n \"acc_norm_stderr\": 0.027648149599751457\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.39814814814814814,\n \"acc_stderr\": 0.027237415094592477,\n \"acc_norm\": 0.39814814814814814,\n \"acc_norm_stderr\": 0.027237415094592477\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3191489361702128,\n \"acc_stderr\": 0.027807990141320193,\n \"acc_norm\": 0.3191489361702128,\n \"acc_norm_stderr\": 0.027807990141320193\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3220338983050847,\n \"acc_stderr\": 0.01193393607189109,\n \"acc_norm\": 0.3220338983050847,\n \"acc_norm_stderr\": 0.01193393607189109\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.3860294117647059,\n \"acc_stderr\": 0.029573269134411124,\n \"acc_norm\": 0.3860294117647059,\n \"acc_norm_stderr\": 0.029573269134411124\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.40032679738562094,\n \"acc_stderr\": 0.019821843688271765,\n \"acc_norm\": 0.40032679738562094,\n \"acc_norm_stderr\": 0.019821843688271765\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.41818181818181815,\n \"acc_stderr\": 0.04724577405731571,\n \"acc_norm\": 0.41818181818181815,\n \"acc_norm_stderr\": 0.04724577405731571\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.3142857142857143,\n \"acc_stderr\": 0.029719329422417482,\n \"acc_norm\": 0.3142857142857143,\n \"acc_norm_stderr\": 0.029719329422417482\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.47761194029850745,\n \"acc_stderr\": 0.035319879302087305,\n \"acc_norm\": 0.47761194029850745,\n \"acc_norm_stderr\": 0.035319879302087305\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.39156626506024095,\n \"acc_stderr\": 0.03799857454479637,\n \"acc_norm\": 0.39156626506024095,\n \"acc_norm_stderr\": 0.03799857454479637\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.5380116959064327,\n \"acc_stderr\": 0.038237270928823064,\n \"acc_norm\": 0.5380116959064327,\n \"acc_norm_stderr\": 0.038237270928823064\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.31456548347613217,\n \"mc1_stderr\": 0.01625524199317919,\n \"mc2\": 0.45584096136441793,\n \"mc2_stderr\": 0.016028055350830416\n }\n}\n```", "repo_url": "https://huggingface.co/TheBloke/wizardLM-7B-HF", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_18T11_33_18.439367", "path": 
["**/details_harness|arc:challenge|25_2023-07-18T11:33:18.439367.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-18T11:33:18.439367.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_18T11_33_18.439367", "path": ["**/details_harness|hellaswag|10_2023-07-18T11:33:18.439367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-18T11:33:18.439367.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_18T11_33_18.439367", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T11:33:18.439367.parquet", 
"**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-18T11:33:18.439367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T11:33:18.439367.parquet", 
"**/details_harness|hendrycksTest-econometrics|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T11:33:18.439367.parquet", 
"**/details_harness|hendrycksTest-public_relations|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-18T11:33:18.439367.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-18T11:33:18.439367.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_18T11_33_18.439367", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T11:33:18.439367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T11:33:18.439367.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_18T11_33_18.439367", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-18T11:33:18.439367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-18T11:33:18.439367.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_18T11_33_18.439367", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-18T11:33:18.439367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-18T11:33:18.439367.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_18T11_33_18.439367", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T11:33:18.439367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T11:33:18.439367.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_18T11_33_18.439367", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T11:33:18.439367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T11:33:18.439367.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_18T11_33_18.439367", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-18T11:33:18.439367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-18T11:33:18.439367.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_18T11_33_18.439367", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T11:33:18.439367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T11:33:18.439367.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_18T11_33_18.439367", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T11:33:18.439367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T11:33:18.439367.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_18T11_33_18.439367", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T11:33:18.439367.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T11:33:18.439367.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_18T11_33_18.439367", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T11:33:18.439367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T11:33:18.439367.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_18T11_33_18.439367", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-18T11:33:18.439367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-18T11:33:18.439367.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_18T11_33_18.439367", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-18T11:33:18.439367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-18T11:33:18.439367.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_18T11_33_18.439367", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T11:33:18.439367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T11:33:18.439367.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_18T11_33_18.439367", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-18T11:33:18.439367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-18T11:33:18.439367.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_18T11_33_18.439367", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T11:33:18.439367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T11:33:18.439367.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_18T11_33_18.439367", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T11:33:18.439367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T11:33:18.439367.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_18T11_33_18.439367", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T11:33:18.439367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T11:33:18.439367.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_18T11_33_18.439367", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-18T11:33:18.439367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-18T11:33:18.439367.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_18T11_33_18.439367", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T11:33:18.439367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T11:33:18.439367.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_18T11_33_18.439367", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T11:33:18.439367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T11:33:18.439367.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_18T11_33_18.439367", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T11:33:18.439367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T11:33:18.439367.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_18T11_33_18.439367", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T11:33:18.439367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T11:33:18.439367.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_18T11_33_18.439367", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T11:33:18.439367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T11:33:18.439367.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_18T11_33_18.439367", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T11:33:18.439367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T11:33:18.439367.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_18T11_33_18.439367", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T11:33:18.439367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T11:33:18.439367.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_18T11_33_18.439367", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T11:33:18.439367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T11:33:18.439367.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_18T11_33_18.439367", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T11:33:18.439367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T11:33:18.439367.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_18T11_33_18.439367", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T11:33:18.439367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T11:33:18.439367.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_18T11_33_18.439367", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T11:33:18.439367.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T11:33:18.439367.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_18T11_33_18.439367", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T11:33:18.439367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T11:33:18.439367.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_18T11_33_18.439367", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T11:33:18.439367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T11:33:18.439367.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_18T11_33_18.439367", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T11:33:18.439367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T11:33:18.439367.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_18T11_33_18.439367", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-18T11:33:18.439367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-18T11:33:18.439367.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_18T11_33_18.439367", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T11:33:18.439367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T11:33:18.439367.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_18T11_33_18.439367", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-18T11:33:18.439367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-18T11:33:18.439367.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_18T11_33_18.439367", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T11:33:18.439367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T11:33:18.439367.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_18T11_33_18.439367", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T11:33:18.439367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T11:33:18.439367.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_18T11_33_18.439367", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T11:33:18.439367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T11:33:18.439367.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_18T11_33_18.439367", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-18T11:33:18.439367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-18T11:33:18.439367.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_18T11_33_18.439367", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-18T11:33:18.439367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-18T11:33:18.439367.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_18T11_33_18.439367", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T11:33:18.439367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T11:33:18.439367.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_18T11_33_18.439367", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T11:33:18.439367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T11:33:18.439367.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_18T11_33_18.439367", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T11:33:18.439367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T11:33:18.439367.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_18T11_33_18.439367", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T11:33:18.439367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T11:33:18.439367.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_18T11_33_18.439367", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-18T11:33:18.439367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-18T11:33:18.439367.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_18T11_33_18.439367", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-18T11:33:18.439367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-18T11:33:18.439367.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_18T11_33_18.439367", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-18T11:33:18.439367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-18T11:33:18.439367.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_18T11_33_18.439367", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T11:33:18.439367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T11:33:18.439367.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_18T11_33_18.439367", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-18T11:33:18.439367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-18T11:33:18.439367.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_18T11_33_18.439367", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T11:33:18.439367.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T11:33:18.439367.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_18T11_33_18.439367", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T11:33:18.439367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T11:33:18.439367.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_18T11_33_18.439367", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-18T11:33:18.439367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-18T11:33:18.439367.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_18T11_33_18.439367", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-18T11:33:18.439367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-18T11:33:18.439367.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_18T11_33_18.439367", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-18T11:33:18.439367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-18T11:33:18.439367.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_18T11_33_18.439367", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T11:33:18.439367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T11:33:18.439367.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_18T11_33_18.439367", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-18T11:33:18.439367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-18T11:33:18.439367.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_18T11_33_18.439367", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-18T11:33:18.439367.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-18T11:33:18.439367.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_18T11_33_18.439367", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-18T11:33:18.439367.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-18T11:33:18.439367.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_18T11_33_18.439367", "path": ["results_2023-07-18T11:33:18.439367.parquet"]}, {"split": "latest", "path": ["results_2023-07-18T11:33:18.439367.parquet"]}]}]}
|
2023-08-27T11:33:27+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of TheBloke/wizardLM-7B-HF
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/wizardLM-7B-HF
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/wizardLM-7B-HF](https://huggingface.co/TheBloke/wizardLM-7B-HF) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
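For instance, a minimal sketch of loading those aggregated results directly (the `results` configuration and its `latest` split are listed in the metadata above):

```python
from datasets import load_dataset

# Aggregated metrics for the most recent evaluation run (the "latest" split)
results = load_dataset("open-llm-leaderboard/details_TheBloke__wizardLM-7B-HF",
	"results",
	split="latest")
```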
To load the details from a run, you can for instance do the following:
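For example, with the `harness_truthfulqa_mc_0` configuration listed in the metadata above:

```python
from datasets import load_dataset

# Per-sample details for the TruthfulQA (0-shot, multiple choice) eval
data = load_dataset("open-llm-leaderboard/details_TheBloke__wizardLM-7B-HF",
	"harness_truthfulqa_mc_0",
	split="train")
```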
## Latest results
These are the [latest results from run 2023-07-18T11:33:18.439367](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__wizardLM-7B-HF/blob/main/results_2023-07-18T11%3A33%3A18.439367.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
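The aggregate scores from that run (the full per-task breakdown is available in the metadata above):

```python
{
    "all": {
        "acc": 0.38566819917906325,
        "acc_stderr": 0.03482242619787474,
        "acc_norm": 0.3891088361419288,
        "acc_norm_stderr": 0.03481173503822327,
        "mc1": 0.31456548347613217,
        "mc1_stderr": 0.01625524199317919,
        "mc2": 0.45584096136441793,
        "mc2_stderr": 0.016028055350830416
    }
}
```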
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
b884246a2aaf51c9b4a08af2d3999f2a91221e0a
|
# Dataset Card for Evaluation run of TheBloke/tulu-13B-fp16
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/tulu-13B-fp16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/tulu-13B-fp16](https://huggingface.co/TheBloke/tulu-13B-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__tulu-13B-fp16",
"harness_winogrande_5",
split="train")
```
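Each configuration also keeps the individual runs as timestamped splits; for instance, a sketch of loading the 2023-10-22 GSM8K run, using the timestamped split name listed in the metadata below:

```python
from datasets import load_dataset

# Load one specific run via its timestamped split name (from the dataset metadata)
data = load_dataset("open-llm-leaderboard/details_TheBloke__tulu-13B-fp16",
	"harness_gsm8k_5",
	split="2023_10_22T17_51_25.855725")
```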
## Latest results
These are the [latest results from run 2023-10-22T17:51:25.855725](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__tulu-13B-fp16/blob/main/results_2023-10-22T17-51-25.855725.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.33001258389261745,
"em_stderr": 0.004815464931125239,
"f1": 0.367210570469799,
"f1_stderr": 0.004753724357053633,
"acc": 0.4493245163726317,
"acc_stderr": 0.010849255862291012
},
"harness|drop|3": {
"em": 0.33001258389261745,
"em_stderr": 0.004815464931125239,
"f1": 0.367210570469799,
"f1_stderr": 0.004753724357053633
},
"harness|gsm8k|5": {
"acc": 0.1425322213798332,
"acc_stderr": 0.009629588445673824
},
"harness|winogrande|5": {
"acc": 0.7561168113654302,
"acc_stderr": 0.0120689232789082
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_TheBloke__tulu-13B-fp16
|
[
"region:us"
] |
2023-08-18T10:25:16+00:00
|
{"pretty_name": "Evaluation run of TheBloke/tulu-13B-fp16", "dataset_summary": "Dataset automatically created during the evaluation run of model [TheBloke/tulu-13B-fp16](https://huggingface.co/TheBloke/tulu-13B-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__tulu-13B-fp16\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-22T17:51:25.855725](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__tulu-13B-fp16/blob/main/results_2023-10-22T17-51-25.855725.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.33001258389261745,\n \"em_stderr\": 0.004815464931125239,\n \"f1\": 0.367210570469799,\n \"f1_stderr\": 0.004753724357053633,\n \"acc\": 0.4493245163726317,\n \"acc_stderr\": 0.010849255862291012\n },\n \"harness|drop|3\": {\n \"em\": 0.33001258389261745,\n \"em_stderr\": 0.004815464931125239,\n \"f1\": 0.367210570469799,\n \"f1_stderr\": 0.004753724357053633\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1425322213798332,\n \"acc_stderr\": 0.009629588445673824\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7561168113654302,\n \"acc_stderr\": 0.0120689232789082\n }\n}\n```", "repo_url": "https://huggingface.co/TheBloke/tulu-13B-fp16", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T18_33_52.983892", "path": ["**/details_harness|arc:challenge|25_2023-07-19T18:33:52.983892.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T18:33:52.983892.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_22T17_51_25.855725", "path": ["**/details_harness|drop|3_2023-10-22T17-51-25.855725.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-22T17-51-25.855725.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_22T17_51_25.855725", "path": ["**/details_harness|gsm8k|5_2023-10-22T17-51-25.855725.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-22T17-51-25.855725.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T18_33_52.983892", "path": ["**/details_harness|hellaswag|10_2023-07-19T18:33:52.983892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T18:33:52.983892.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_07_19T18_33_52.983892", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:33:52.983892.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:33:52.983892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:33:52.983892.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T18:33:52.983892.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:33:52.983892.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T18_33_52.983892", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:33:52.983892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:33:52.983892.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T18_33_52.983892", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:33:52.983892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:33:52.983892.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T18_33_52.983892", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:33:52.983892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:33:52.983892.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T18_33_52.983892", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:33:52.983892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:33:52.983892.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T18_33_52.983892", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:33:52.983892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:33:52.983892.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T18_33_52.983892", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:33:52.983892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:33:52.983892.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T18_33_52.983892", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:33:52.983892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:33:52.983892.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T18_33_52.983892", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:33:52.983892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:33:52.983892.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T18_33_52.983892", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:33:52.983892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:33:52.983892.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T18_33_52.983892", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:33:52.983892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:33:52.983892.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T18_33_52.983892", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:33:52.983892.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:33:52.983892.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T18_33_52.983892", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:33:52.983892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:33:52.983892.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T18_33_52.983892", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:33:52.983892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:33:52.983892.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T18_33_52.983892", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:33:52.983892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:33:52.983892.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T18_33_52.983892", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:33:52.983892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:33:52.983892.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T18_33_52.983892", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:33:52.983892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:33:52.983892.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T18_33_52.983892", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:33:52.983892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:33:52.983892.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T18_33_52.983892", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:33:52.983892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:33:52.983892.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T18_33_52.983892", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:33:52.983892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:33:52.983892.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T18_33_52.983892", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:33:52.983892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:33:52.983892.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T18_33_52.983892", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:33:52.983892.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:33:52.983892.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T18_33_52.983892", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:33:52.983892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:33:52.983892.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T18_33_52.983892", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:33:52.983892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:33:52.983892.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T18_33_52.983892", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:33:52.983892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:33:52.983892.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T18_33_52.983892", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:33:52.983892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:33:52.983892.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T18_33_52.983892", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:33:52.983892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:33:52.983892.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T18_33_52.983892", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:33:52.983892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:33:52.983892.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T18_33_52.983892", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:33:52.983892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:33:52.983892.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T18_33_52.983892", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:33:52.983892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:33:52.983892.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T18_33_52.983892", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:33:52.983892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:33:52.983892.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T18_33_52.983892", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:33:52.983892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:33:52.983892.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T18_33_52.983892", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:33:52.983892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:33:52.983892.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T18_33_52.983892", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:33:52.983892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:33:52.983892.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T18_33_52.983892", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:33:52.983892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:33:52.983892.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T18_33_52.983892", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:33:52.983892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:33:52.983892.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T18_33_52.983892", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:33:52.983892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:33:52.983892.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T18_33_52.983892", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:33:52.983892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:33:52.983892.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T18_33_52.983892", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:33:52.983892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:33:52.983892.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T18_33_52.983892", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T18:33:52.983892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T18:33:52.983892.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T18_33_52.983892", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:33:52.983892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:33:52.983892.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T18_33_52.983892", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:33:52.983892.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:33:52.983892.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T18_33_52.983892", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:33:52.983892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:33:52.983892.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T18_33_52.983892", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:33:52.983892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:33:52.983892.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T18_33_52.983892", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:33:52.983892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:33:52.983892.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T18_33_52.983892", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:33:52.983892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:33:52.983892.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T18_33_52.983892", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:33:52.983892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:33:52.983892.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T18_33_52.983892", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:33:52.983892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:33:52.983892.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T18_33_52.983892", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:33:52.983892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:33:52.983892.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T18_33_52.983892", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:33:52.983892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:33:52.983892.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T18_33_52.983892", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:33:52.983892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:33:52.983892.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T18_33_52.983892", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:33:52.983892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:33:52.983892.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T18_33_52.983892", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:33:52.983892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:33:52.983892.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T18_33_52.983892", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:33:52.983892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:33:52.983892.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T18_33_52.983892", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:33:52.983892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:33:52.983892.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T18_33_52.983892", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:33:52.983892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:33:52.983892.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T18_33_52.983892", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T18:33:52.983892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T18:33:52.983892.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T18_33_52.983892", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:33:52.983892.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:33:52.983892.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T18_33_52.983892", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T18:33:52.983892.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T18:33:52.983892.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_22T17_51_25.855725", "path": ["**/details_harness|winogrande|5_2023-10-22T17-51-25.855725.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-22T17-51-25.855725.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T18_33_52.983892", "path": ["results_2023-07-19T18:33:52.983892.parquet"]}, {"split": "2023_10_22T17_51_25.855725", "path": ["results_2023-10-22T17-51-25.855725.parquet"]}, {"split": "latest", "path": ["results_2023-10-22T17-51-25.855725.parquet"]}]}]}
|
2023-10-22T16:51:37+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of TheBloke/tulu-13B-fp16
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model TheBloke/tulu-13B-fp16 on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
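A minimal sketch of that call, assuming the details repo id follows the leaderboard's usual `open-llm-leaderboard/details_<org>__<model>` naming convention (the exact repo id is an assumption here):

```python
from datasets import load_dataset

# Repo id assumed from the leaderboard's naming convention for this model;
# "harness_winogrande_5" is one of the configs listed in this record's metadata.
data = load_dataset("open-llm-leaderboard/details_TheBloke__tulu-13B-fp16",
	"harness_winogrande_5",
	split="train")
```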
## Latest results
These are the latest results from run 2023-10-22T17:51:25.855725 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of TheBloke/tulu-13B-fp16",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/tulu-13B-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-22T17:51:25.855725(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of TheBloke/tulu-13B-fp16",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/tulu-13B-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-22T17:51:25.855725(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
22,
31,
170,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of TheBloke/tulu-13B-fp16## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/tulu-13B-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-22T17:51:25.855725(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
c8e56812c5d1b87c8a37f7489132959047fa9044
|
# Dataset Card for Evaluation run of TheBloke/Nous-Hermes-13B-SuperHOT-8K-fp16
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/Nous-Hermes-13B-SuperHOT-8K-fp16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/Nous-Hermes-13B-SuperHOT-8K-fp16](https://huggingface.co/TheBloke/Nous-Hermes-13B-SuperHOT-8K-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__Nous-Hermes-13B-SuperHOT-8K-fp16",
"harness_winogrande_5",
split="train")
```
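The same call works for any of the per-task configurations defined in this repo's metadata; for example, a short sketch that pulls only the most recent GSM8K details through the `latest` split:

```python
from datasets import load_dataset

# "harness_gsm8k_5" is one of the per-task configs of this repo;
# the "latest" split always resolves to the most recent run for that task.
gsm8k_details = load_dataset("open-llm-leaderboard/details_TheBloke__Nous-Hermes-13B-SuperHOT-8K-fp16",
	"harness_gsm8k_5",
	split="latest")
```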
## Latest results
These are the [latest results from run 2023-10-22T21:24:49.496203](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Nous-Hermes-13B-SuperHOT-8K-fp16/blob/main/results_2023-10-22T21-24-49.496203.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.24779781879194632,
"em_stderr": 0.004421358038007316,
"f1": 0.3203208892617463,
"f1_stderr": 0.004418252169927022,
"acc": 0.3825450746272229,
"acc_stderr": 0.007568348592873263
},
"harness|drop|3": {
"em": 0.24779781879194632,
"em_stderr": 0.004421358038007316,
"f1": 0.3203208892617463,
"f1_stderr": 0.004418252169927022
},
"harness|gsm8k|5": {
"acc": 0.012130401819560273,
"acc_stderr": 0.003015294242890953
},
"harness|winogrande|5": {
"acc": 0.7529597474348856,
"acc_stderr": 0.012121402942855573
}
}
```
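These aggregated numbers are also stored in the "results" configuration, so a small sketch can read them programmatically (the row layout is assumed to mirror the JSON above):

```python
from datasets import load_dataset

# The "results" config keeps one split per run plus "latest";
# loading "latest" returns the aggregated metrics of the newest run.
results = load_dataset("open-llm-leaderboard/details_TheBloke__Nous-Hermes-13B-SuperHOT-8K-fp16",
	"results",
	split="latest")
print(results[0])  # field layout assumed to mirror the JSON shown above
```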
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_TheBloke__Nous-Hermes-13B-SuperHOT-8K-fp16
|
[
"region:us"
] |
2023-08-18T10:25:24+00:00
|
{"pretty_name": "Evaluation run of TheBloke/Nous-Hermes-13B-SuperHOT-8K-fp16", "dataset_summary": "Dataset automatically created during the evaluation run of model [TheBloke/Nous-Hermes-13B-SuperHOT-8K-fp16](https://huggingface.co/TheBloke/Nous-Hermes-13B-SuperHOT-8K-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__Nous-Hermes-13B-SuperHOT-8K-fp16\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-22T21:24:49.496203](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Nous-Hermes-13B-SuperHOT-8K-fp16/blob/main/results_2023-10-22T21-24-49.496203.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.24779781879194632,\n \"em_stderr\": 0.004421358038007316,\n \"f1\": 0.3203208892617463,\n \"f1_stderr\": 0.004418252169927022,\n \"acc\": 0.3825450746272229,\n \"acc_stderr\": 0.007568348592873263\n },\n \"harness|drop|3\": {\n \"em\": 0.24779781879194632,\n \"em_stderr\": 0.004421358038007316,\n \"f1\": 0.3203208892617463,\n \"f1_stderr\": 0.004418252169927022\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.012130401819560273,\n \"acc_stderr\": 0.003015294242890953\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7529597474348856,\n \"acc_stderr\": 0.012121402942855573\n }\n}\n```", "repo_url": "https://huggingface.co/TheBloke/Nous-Hermes-13B-SuperHOT-8K-fp16", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_01T13_07_54.585648", "path": ["**/details_harness|arc:challenge|25_2023-08-01T13:07:54.585648.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-01T13:07:54.585648.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_22T21_24_49.496203", "path": ["**/details_harness|drop|3_2023-10-22T21-24-49.496203.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-22T21-24-49.496203.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_22T21_24_49.496203", "path": ["**/details_harness|gsm8k|5_2023-10-22T21-24-49.496203.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-22T21-24-49.496203.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_01T13_07_54.585648", "path": ["**/details_harness|hellaswag|10_2023-08-01T13:07:54.585648.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hellaswag|10_2023-08-01T13:07:54.585648.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_01T13_07_54.585648", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-01T13:07:54.585648.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-01T13:07:54.585648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-01T13:07:54.585648.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-01T13:07:54.585648.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-01T13:07:54.585648.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-08-01T13:07:54.585648.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_01T13_07_54.585648", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-01T13:07:54.585648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-01T13:07:54.585648.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_01T13_07_54.585648", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-01T13:07:54.585648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-01T13:07:54.585648.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_01T13_07_54.585648", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-01T13:07:54.585648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-01T13:07:54.585648.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_01T13_07_54.585648", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-01T13:07:54.585648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-01T13:07:54.585648.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_01T13_07_54.585648", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-01T13:07:54.585648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-01T13:07:54.585648.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_01T13_07_54.585648", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-01T13:07:54.585648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-01T13:07:54.585648.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_01T13_07_54.585648", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-01T13:07:54.585648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-01T13:07:54.585648.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_01T13_07_54.585648", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-01T13:07:54.585648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-01T13:07:54.585648.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_01T13_07_54.585648", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-01T13:07:54.585648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-01T13:07:54.585648.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_01T13_07_54.585648", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-01T13:07:54.585648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-01T13:07:54.585648.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_08_01T13_07_54.585648", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-01T13:07:54.585648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-01T13:07:54.585648.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_01T13_07_54.585648", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-01T13:07:54.585648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-01T13:07:54.585648.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_01T13_07_54.585648", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-01T13:07:54.585648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-01T13:07:54.585648.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_01T13_07_54.585648", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-01T13:07:54.585648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-01T13:07:54.585648.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_01T13_07_54.585648", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-01T13:07:54.585648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-01T13:07:54.585648.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_01T13_07_54.585648", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-01T13:07:54.585648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-01T13:07:54.585648.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_01T13_07_54.585648", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-01T13:07:54.585648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-01T13:07:54.585648.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_01T13_07_54.585648", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-01T13:07:54.585648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-01T13:07:54.585648.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_01T13_07_54.585648", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-01T13:07:54.585648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-01T13:07:54.585648.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_01T13_07_54.585648", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-01T13:07:54.585648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-01T13:07:54.585648.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_01T13_07_54.585648", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-01T13:07:54.585648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-01T13:07:54.585648.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_01T13_07_54.585648", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-01T13:07:54.585648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-01T13:07:54.585648.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_01T13_07_54.585648", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-01T13:07:54.585648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-01T13:07:54.585648.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_01T13_07_54.585648", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-01T13:07:54.585648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-01T13:07:54.585648.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_01T13_07_54.585648", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-01T13:07:54.585648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-01T13:07:54.585648.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_01T13_07_54.585648", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-01T13:07:54.585648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-01T13:07:54.585648.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_01T13_07_54.585648", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-01T13:07:54.585648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-01T13:07:54.585648.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_01T13_07_54.585648", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-01T13:07:54.585648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-01T13:07:54.585648.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_01T13_07_54.585648", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-01T13:07:54.585648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-01T13:07:54.585648.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_01T13_07_54.585648", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-01T13:07:54.585648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-01T13:07:54.585648.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_01T13_07_54.585648", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-01T13:07:54.585648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-01T13:07:54.585648.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_01T13_07_54.585648", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-01T13:07:54.585648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-01T13:07:54.585648.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_01T13_07_54.585648", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-01T13:07:54.585648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-01T13:07:54.585648.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_01T13_07_54.585648", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-01T13:07:54.585648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-01T13:07:54.585648.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_01T13_07_54.585648", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-01T13:07:54.585648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-01T13:07:54.585648.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_01T13_07_54.585648", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-01T13:07:54.585648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-01T13:07:54.585648.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_01T13_07_54.585648", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-01T13:07:54.585648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-01T13:07:54.585648.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_01T13_07_54.585648", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-01T13:07:54.585648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-01T13:07:54.585648.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_01T13_07_54.585648", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-01T13:07:54.585648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-01T13:07:54.585648.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_01T13_07_54.585648", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-01T13:07:54.585648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-01T13:07:54.585648.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_01T13_07_54.585648", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-01T13:07:54.585648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-01T13:07:54.585648.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_01T13_07_54.585648", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-01T13:07:54.585648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-01T13:07:54.585648.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_01T13_07_54.585648", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-01T13:07:54.585648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-01T13:07:54.585648.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_01T13_07_54.585648", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-01T13:07:54.585648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-01T13:07:54.585648.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_01T13_07_54.585648", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-01T13:07:54.585648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-01T13:07:54.585648.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_01T13_07_54.585648", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-01T13:07:54.585648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-01T13:07:54.585648.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_01T13_07_54.585648", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-01T13:07:54.585648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-01T13:07:54.585648.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_01T13_07_54.585648", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-01T13:07:54.585648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-01T13:07:54.585648.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_01T13_07_54.585648", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-01T13:07:54.585648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-01T13:07:54.585648.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_01T13_07_54.585648", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-01T13:07:54.585648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-01T13:07:54.585648.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_01T13_07_54.585648", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-01T13:07:54.585648.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-01T13:07:54.585648.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_01T13_07_54.585648", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-01T13:07:54.585648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-01T13:07:54.585648.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_01T13_07_54.585648", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-01T13:07:54.585648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-01T13:07:54.585648.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_01T13_07_54.585648", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-01T13:07:54.585648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-01T13:07:54.585648.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_01T13_07_54.585648", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-01T13:07:54.585648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-01T13:07:54.585648.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_01T13_07_54.585648", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-01T13:07:54.585648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-01T13:07:54.585648.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_01T13_07_54.585648", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-01T13:07:54.585648.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-01T13:07:54.585648.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_01T13_07_54.585648", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-01T13:07:54.585648.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-01T13:07:54.585648.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_22T21_24_49.496203", "path": ["**/details_harness|winogrande|5_2023-10-22T21-24-49.496203.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-22T21-24-49.496203.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_01T13_07_54.585648", "path": ["results_2023-08-01T13:07:54.585648.parquet"]}, {"split": "2023_10_22T21_24_49.496203", "path": ["results_2023-10-22T21-24-49.496203.parquet"]}, {"split": "latest", "path": ["results_2023-10-22T21-24-49.496203.parquet"]}]}]}
|
2023-10-22T20:25:02+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of TheBloke/Nous-Hermes-13B-SuperHOT-8K-fp16
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model TheBloke/Nous-Hermes-13B-SuperHOT-8K-fp16 on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
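The code block was stripped from this rendering of the card, so the following is a minimal sketch of the loading pattern used throughout these cards. The repository id follows the leaderboard's `details_<org>__<model>` naming convention, and `harness_winogrande_5` is one of the configs listed in this card's metadata; treat both as inferred rather than confirmed.
```python
from datasets import load_dataset

# Repo id inferred from the leaderboard's details_<org>__<model> convention;
# "harness_winogrande_5" is one of the configs listed in this card's metadata.
data = load_dataset(
    "open-llm-leaderboard/details_TheBloke__Nous-Hermes-13B-SuperHOT-8K-fp16",
    "harness_winogrande_5",
    split="train",
)
```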
## Latest results
These are the latest results from run 2023-10-22T21:24:49.496203 (note that there might be results for other tasks in this repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of TheBloke/Nous-Hermes-13B-SuperHOT-8K-fp16",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/Nous-Hermes-13B-SuperHOT-8K-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-22T21:24:49.496203(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of TheBloke/Nous-Hermes-13B-SuperHOT-8K-fp16",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/Nous-Hermes-13B-SuperHOT-8K-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-22T21:24:49.496203(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
31,
31,
179,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of TheBloke/Nous-Hermes-13B-SuperHOT-8K-fp16## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/Nous-Hermes-13B-SuperHOT-8K-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-22T21:24:49.496203(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
fedef13a7ed905dc092faadde43a27758caa2c06
|
# Dataset Card for Evaluation run of TheBloke/llama-30b-supercot-SuperHOT-8K-fp16
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/llama-30b-supercot-SuperHOT-8K-fp16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/llama-30b-supercot-SuperHOT-8K-fp16](https://huggingface.co/TheBloke/llama-30b-supercot-SuperHOT-8K-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__llama-30b-supercot-SuperHOT-8K-fp16",
"harness_truthfulqa_mc_0",
split="train")
```
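The same pattern works for the aggregated "results" config, and a specific run can be addressed by its timestamped split instead of "latest". A minimal sketch, assuming the config and split names listed in this card's metadata:
```python
from datasets import load_dataset

# Aggregated metrics for the run; the "latest" split always points to the
# most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_TheBloke__llama-30b-supercot-SuperHOT-8K-fp16",
    "results",
    split="latest",
)

# A single run can also be selected by its timestamped split name, as listed
# in this card's metadata.
run = load_dataset(
    "open-llm-leaderboard/details_TheBloke__llama-30b-supercot-SuperHOT-8K-fp16",
    "harness_truthfulqa_mc_0",
    split="2023_08_01T15_49_06.725548",
)
```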
## Latest results
These are the [latest results from run 2023-08-01T15:49:06.725548](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__llama-30b-supercot-SuperHOT-8K-fp16/blob/main/results_2023-08-01T15%3A49%3A06.725548.json) (note that there might be results for other tasks in this repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.23555810672922425,
"acc_stderr": 0.030884313140032385,
"acc_norm": 0.2365942449124113,
"acc_norm_stderr": 0.030895609482077938,
"mc1": 0.2350061199510404,
"mc1_stderr": 0.014843061507731608,
"mc2": 0.4704454489388094,
"mc2_stderr": 0.016777097412683316
},
"harness|arc:challenge|25": {
"acc": 0.22866894197952217,
"acc_stderr": 0.012272853582540806,
"acc_norm": 0.25853242320819114,
"acc_norm_stderr": 0.012794553754288686
},
"harness|hellaswag|10": {
"acc": 0.2740489942242581,
"acc_stderr": 0.004451222241494048,
"acc_norm": 0.3053176658036248,
"acc_norm_stderr": 0.004596006250433552
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.03355677216313142,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.03355677216313142
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.18421052631578946,
"acc_stderr": 0.0315469804508223,
"acc_norm": 0.18421052631578946,
"acc_norm_stderr": 0.0315469804508223
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21509433962264152,
"acc_stderr": 0.02528839450289137,
"acc_norm": 0.21509433962264152,
"acc_norm_stderr": 0.02528839450289137
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.21,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.21,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749874,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749874
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.22758620689655173,
"acc_stderr": 0.03493950380131184,
"acc_norm": 0.22758620689655173,
"acc_norm_stderr": 0.03493950380131184
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.021132859182754444,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.021132859182754444
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04040610178208841,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04040610178208841
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.24838709677419354,
"acc_stderr": 0.02458002892148101,
"acc_norm": 0.24838709677419354,
"acc_norm_stderr": 0.02458002892148101
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.18226600985221675,
"acc_stderr": 0.02716334085964515,
"acc_norm": 0.18226600985221675,
"acc_norm_stderr": 0.02716334085964515
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.18181818181818182,
"acc_stderr": 0.027479603010538794,
"acc_norm": 0.18181818181818182,
"acc_norm_stderr": 0.027479603010538794
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.24352331606217617,
"acc_stderr": 0.030975436386845426,
"acc_norm": 0.24352331606217617,
"acc_norm_stderr": 0.030975436386845426
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2205128205128205,
"acc_stderr": 0.021020672680827912,
"acc_norm": 0.2205128205128205,
"acc_norm_stderr": 0.021020672680827912
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.02534809746809783,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.02534809746809783
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.22268907563025211,
"acc_stderr": 0.02702543349888238,
"acc_norm": 0.22268907563025211,
"acc_norm_stderr": 0.02702543349888238
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.17880794701986755,
"acc_stderr": 0.031287448506007245,
"acc_norm": 0.17880794701986755,
"acc_norm_stderr": 0.031287448506007245
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1761467889908257,
"acc_stderr": 0.01633288239343138,
"acc_norm": 0.1761467889908257,
"acc_norm_stderr": 0.01633288239343138
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.18981481481481483,
"acc_stderr": 0.026744714834691926,
"acc_norm": 0.18981481481481483,
"acc_norm_stderr": 0.026744714834691926
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.32286995515695066,
"acc_stderr": 0.03138147637575499,
"acc_norm": 0.32286995515695066,
"acc_norm_stderr": 0.03138147637575499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2085889570552147,
"acc_stderr": 0.03192193448934722,
"acc_norm": 0.2085889570552147,
"acc_norm_stderr": 0.03192193448934722
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2905982905982906,
"acc_stderr": 0.02974504857267404,
"acc_norm": 0.2905982905982906,
"acc_norm_stderr": 0.02974504857267404
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23116219667943805,
"acc_stderr": 0.015075523238101091,
"acc_norm": 0.23116219667943805,
"acc_norm_stderr": 0.015075523238101091
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.25163398692810457,
"acc_stderr": 0.0248480182638752,
"acc_norm": 0.25163398692810457,
"acc_norm_stderr": 0.0248480182638752
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.02212243977248077,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.02212243977248077
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2191358024691358,
"acc_stderr": 0.0230167056402622,
"acc_norm": 0.2191358024691358,
"acc_norm_stderr": 0.0230167056402622
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23049645390070922,
"acc_stderr": 0.025123739226872405,
"acc_norm": 0.23049645390070922,
"acc_norm_stderr": 0.025123739226872405
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.18775510204081633,
"acc_stderr": 0.02500025603954621,
"acc_norm": 0.18775510204081633,
"acc_norm_stderr": 0.02500025603954621
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2350061199510404,
"mc1_stderr": 0.014843061507731608,
"mc2": 0.4704454489388094,
"mc2_stderr": 0.016777097412683316
}
}
```
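As a quick illustration of consuming the dictionary above, the sketch below averages the per-subject MMLU (hendrycksTest) accuracies; it assumes the printed dictionary has already been loaded into a Python variable named `results`.
```python
# `results` is assumed to hold the dictionary printed above.
mmlu = [v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest-")]
print(f"average MMLU accuracy over {len(mmlu)} subjects: {sum(mmlu) / len(mmlu):.4f}")
```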
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_TheBloke__llama-30b-supercot-SuperHOT-8K-fp16
|
[
"region:us"
] |
2023-08-18T10:25:33+00:00
|
{"pretty_name": "Evaluation run of TheBloke/llama-30b-supercot-SuperHOT-8K-fp16", "dataset_summary": "Dataset automatically created during the evaluation run of model [TheBloke/llama-30b-supercot-SuperHOT-8K-fp16](https://huggingface.co/TheBloke/llama-30b-supercot-SuperHOT-8K-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__llama-30b-supercot-SuperHOT-8K-fp16\",\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-08-01T15:49:06.725548](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__llama-30b-supercot-SuperHOT-8K-fp16/blob/main/results_2023-08-01T15%3A49%3A06.725548.json) (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.23555810672922425,\n \"acc_stderr\": 0.030884313140032385,\n \"acc_norm\": 0.2365942449124113,\n \"acc_norm_stderr\": 0.030895609482077938,\n \"mc1\": 0.2350061199510404,\n \"mc1_stderr\": 0.014843061507731608,\n \"mc2\": 0.4704454489388094,\n \"mc2_stderr\": 0.016777097412683316\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.22866894197952217,\n \"acc_stderr\": 0.012272853582540806,\n \"acc_norm\": 0.25853242320819114,\n \"acc_norm_stderr\": 0.012794553754288686\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2740489942242581,\n \"acc_stderr\": 0.004451222241494048,\n \"acc_norm\": 0.3053176658036248,\n \"acc_norm_stderr\": 0.004596006250433552\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.18518518518518517,\n \"acc_stderr\": 0.03355677216313142,\n \"acc_norm\": 0.18518518518518517,\n \"acc_norm_stderr\": 0.03355677216313142\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.18421052631578946,\n \"acc_stderr\": 0.0315469804508223,\n \"acc_norm\": 0.18421052631578946,\n \"acc_norm_stderr\": 0.0315469804508223\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.21509433962264152,\n \"acc_stderr\": 0.02528839450289137,\n \"acc_norm\": 0.21509433962264152,\n \"acc_norm_stderr\": 0.02528839450289137\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n \"acc_stderr\": 0.030952890217749874,\n \"acc_norm\": 0.20809248554913296,\n \"acc_norm_stderr\": 0.030952890217749874\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.22758620689655173,\n \"acc_stderr\": 0.03493950380131184,\n \"acc_norm\": 0.22758620689655173,\n \"acc_norm_stderr\": 0.03493950380131184\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.21428571428571427,\n \"acc_stderr\": 0.021132859182754444,\n \"acc_norm\": 0.21428571428571427,\n \"acc_norm_stderr\": 0.021132859182754444\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.24838709677419354,\n \"acc_stderr\": 0.02458002892148101,\n \"acc_norm\": 0.24838709677419354,\n \"acc_norm_stderr\": 0.02458002892148101\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.18226600985221675,\n \"acc_stderr\": 0.02716334085964515,\n \"acc_norm\": 0.18226600985221675,\n \"acc_norm_stderr\": 0.02716334085964515\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.18181818181818182,\n \"acc_stderr\": 0.027479603010538794,\n \"acc_norm\": 0.18181818181818182,\n \"acc_norm_stderr\": 0.027479603010538794\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.24352331606217617,\n \"acc_stderr\": 0.030975436386845426,\n \"acc_norm\": 
0.24352331606217617,\n \"acc_norm_stderr\": 0.030975436386845426\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.2205128205128205,\n \"acc_stderr\": 0.021020672680827912,\n \"acc_norm\": 0.2205128205128205,\n \"acc_norm_stderr\": 0.021020672680827912\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.02534809746809783,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.02534809746809783\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.22268907563025211,\n \"acc_stderr\": 0.02702543349888238,\n \"acc_norm\": 0.22268907563025211,\n \"acc_norm_stderr\": 0.02702543349888238\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.17880794701986755,\n \"acc_stderr\": 0.031287448506007245,\n \"acc_norm\": 0.17880794701986755,\n \"acc_norm_stderr\": 0.031287448506007245\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.1761467889908257,\n \"acc_stderr\": 0.01633288239343138,\n \"acc_norm\": 0.1761467889908257,\n \"acc_norm_stderr\": 0.01633288239343138\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.18981481481481483,\n \"acc_stderr\": 0.026744714834691926,\n \"acc_norm\": 0.18981481481481483,\n \"acc_norm_stderr\": 0.026744714834691926\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.32286995515695066,\n \"acc_stderr\": 0.03138147637575499,\n \"acc_norm\": 0.32286995515695066,\n \"acc_norm_stderr\": 0.03138147637575499\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.2085889570552147,\n \"acc_stderr\": 0.03192193448934722,\n \"acc_norm\": 0.2085889570552147,\n \"acc_norm_stderr\": 0.03192193448934722\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n \"acc_stderr\": 0.02974504857267404,\n \"acc_norm\": 0.2905982905982906,\n \"acc_norm_stderr\": 0.02974504857267404\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23116219667943805,\n \"acc_stderr\": 0.015075523238101091,\n \"acc_norm\": 0.23116219667943805,\n \"acc_norm_stderr\": 0.015075523238101091\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.25163398692810457,\n \"acc_stderr\": 0.0248480182638752,\n \"acc_norm\": 0.25163398692810457,\n \"acc_norm_stderr\": 0.0248480182638752\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.2191358024691358,\n \"acc_stderr\": 0.0230167056402622,\n \"acc_norm\": 0.2191358024691358,\n \"acc_norm_stderr\": 0.0230167056402622\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.23049645390070922,\n \"acc_stderr\": 0.025123739226872405,\n \"acc_norm\": 0.23049645390070922,\n \"acc_norm_stderr\": 0.025123739226872405\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.18775510204081633,\n \"acc_stderr\": 0.02500025603954621,\n \"acc_norm\": 0.18775510204081633,\n \"acc_norm_stderr\": 0.02500025603954621\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3216374269005848,\n \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.3216374269005848,\n \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2350061199510404,\n \"mc1_stderr\": 0.014843061507731608,\n \"mc2\": 0.4704454489388094,\n \"mc2_stderr\": 0.016777097412683316\n }\n}\n```", "repo_url": "https://huggingface.co/TheBloke/llama-30b-supercot-SuperHOT-8K-fp16", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_01T15_49_06.725548", "path": ["**/details_harness|arc:challenge|25_2023-08-01T15:49:06.725548.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-01T15:49:06.725548.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_01T15_49_06.725548", "path": ["**/details_harness|hellaswag|10_2023-08-01T15:49:06.725548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-01T15:49:06.725548.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_01T15_49_06.725548", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-01T15:49:06.725548.parquet", 
"**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-01T15:49:06.725548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-01T15:49:06.725548.parquet", 
"**/details_harness|hendrycksTest-computer_security|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-01T15:49:06.725548.parquet", 
"**/details_harness|hendrycksTest-professional_medicine|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-01T15:49:06.725548.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-01T15:49:06.725548.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_01T15_49_06.725548", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-01T15:49:06.725548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-01T15:49:06.725548.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_01T15_49_06.725548", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-01T15:49:06.725548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-01T15:49:06.725548.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_01T15_49_06.725548", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-01T15:49:06.725548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-01T15:49:06.725548.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_01T15_49_06.725548", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-01T15:49:06.725548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-01T15:49:06.725548.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_01T15_49_06.725548", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-01T15:49:06.725548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-01T15:49:06.725548.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_01T15_49_06.725548", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-01T15:49:06.725548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-01T15:49:06.725548.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_01T15_49_06.725548", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-01T15:49:06.725548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-01T15:49:06.725548.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_01T15_49_06.725548", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-01T15:49:06.725548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-01T15:49:06.725548.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_01T15_49_06.725548", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-01T15:49:06.725548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-01T15:49:06.725548.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_01T15_49_06.725548", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-01T15:49:06.725548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-01T15:49:06.725548.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_01T15_49_06.725548", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-01T15:49:06.725548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-01T15:49:06.725548.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_01T15_49_06.725548", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-01T15:49:06.725548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-01T15:49:06.725548.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_01T15_49_06.725548", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-01T15:49:06.725548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-01T15:49:06.725548.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_01T15_49_06.725548", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-01T15:49:06.725548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-01T15:49:06.725548.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_01T15_49_06.725548", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-01T15:49:06.725548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-01T15:49:06.725548.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_01T15_49_06.725548", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-01T15:49:06.725548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-01T15:49:06.725548.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_01T15_49_06.725548", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-01T15:49:06.725548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-01T15:49:06.725548.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_01T15_49_06.725548", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-01T15:49:06.725548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-01T15:49:06.725548.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_01T15_49_06.725548", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-01T15:49:06.725548.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-01T15:49:06.725548.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_01T15_49_06.725548", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-01T15:49:06.725548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-01T15:49:06.725548.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_01T15_49_06.725548", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-01T15:49:06.725548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-01T15:49:06.725548.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_01T15_49_06.725548", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-01T15:49:06.725548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-01T15:49:06.725548.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_01T15_49_06.725548", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-01T15:49:06.725548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-01T15:49:06.725548.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_01T15_49_06.725548", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-01T15:49:06.725548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-01T15:49:06.725548.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_01T15_49_06.725548", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-01T15:49:06.725548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-01T15:49:06.725548.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_01T15_49_06.725548", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-01T15:49:06.725548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-01T15:49:06.725548.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_01T15_49_06.725548", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-01T15:49:06.725548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-01T15:49:06.725548.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_01T15_49_06.725548", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-01T15:49:06.725548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-01T15:49:06.725548.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_01T15_49_06.725548", "path": 
["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-01T15:49:06.725548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-01T15:49:06.725548.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_01T15_49_06.725548", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-01T15:49:06.725548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-01T15:49:06.725548.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_01T15_49_06.725548", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-01T15:49:06.725548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-01T15:49:06.725548.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_01T15_49_06.725548", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-01T15:49:06.725548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-01T15:49:06.725548.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_01T15_49_06.725548", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-01T15:49:06.725548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-01T15:49:06.725548.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_01T15_49_06.725548", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-01T15:49:06.725548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-01T15:49:06.725548.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_01T15_49_06.725548", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-01T15:49:06.725548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-01T15:49:06.725548.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_01T15_49_06.725548", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-01T15:49:06.725548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-01T15:49:06.725548.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_01T15_49_06.725548", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-01T15:49:06.725548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-01T15:49:06.725548.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_01T15_49_06.725548", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-01T15:49:06.725548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-01T15:49:06.725548.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_01T15_49_06.725548", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-01T15:49:06.725548.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-management|5_2023-08-01T15:49:06.725548.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_01T15_49_06.725548", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-01T15:49:06.725548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-01T15:49:06.725548.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_01T15_49_06.725548", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-01T15:49:06.725548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-01T15:49:06.725548.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_01T15_49_06.725548", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-01T15:49:06.725548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-01T15:49:06.725548.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_01T15_49_06.725548", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-01T15:49:06.725548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-01T15:49:06.725548.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_01T15_49_06.725548", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-01T15:49:06.725548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-01T15:49:06.725548.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_01T15_49_06.725548", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-01T15:49:06.725548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-01T15:49:06.725548.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_01T15_49_06.725548", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-01T15:49:06.725548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-01T15:49:06.725548.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_01T15_49_06.725548", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-01T15:49:06.725548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-01T15:49:06.725548.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_01T15_49_06.725548", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-01T15:49:06.725548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-01T15:49:06.725548.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_01T15_49_06.725548", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-01T15:49:06.725548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-01T15:49:06.725548.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_01T15_49_06.725548", "path": 
["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-01T15:49:06.725548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-01T15:49:06.725548.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_01T15_49_06.725548", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-01T15:49:06.725548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-01T15:49:06.725548.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_01T15_49_06.725548", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-01T15:49:06.725548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-01T15:49:06.725548.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_01T15_49_06.725548", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-01T15:49:06.725548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-01T15:49:06.725548.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_01T15_49_06.725548", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-01T15:49:06.725548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-01T15:49:06.725548.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_01T15_49_06.725548", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-01T15:49:06.725548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-01T15:49:06.725548.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_01T15_49_06.725548", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-01T15:49:06.725548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-01T15:49:06.725548.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_01T15_49_06.725548", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-01T15:49:06.725548.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-01T15:49:06.725548.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_01T15_49_06.725548", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-01T15:49:06.725548.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-01T15:49:06.725548.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_01T15_49_06.725548", "path": ["results_2023-08-01T15:49:06.725548.parquet"]}, {"split": "latest", "path": ["results_2023-08-01T15:49:06.725548.parquet"]}]}]}
|
2023-08-27T11:33:32+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of TheBloke/llama-30b-supercot-SuperHOT-8K-fp16
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model TheBloke/llama-30b-supercot-SuperHOT-8K-fp16 on the Open LLM Leaderboard.
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
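The Python snippet was stripped when this card's text was flattened; a minimal sketch of it, following the loading pattern used by the other cards in this dump, is below. The repo id is assumed from the `details_<org>__<model>` naming convention, and `harness_truthfulqa_mc_0` is taken from this card's own config list:

```python
from datasets import load_dataset

# Repo id assumed from the open-llm-leaderboard/details_<org>__<model> convention;
# "harness_truthfulqa_mc_0" is one of the configs declared in this card's metadata.
data = load_dataset("open-llm-leaderboard/details_TheBloke__llama-30b-supercot-SuperHOT-8K-fp16",
	"harness_truthfulqa_mc_0",
	split="train")
```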
## Latest results
These are the latest results from run 2023-08-01T15:49:06.725548 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of TheBloke/llama-30b-supercot-SuperHOT-8K-fp16",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/llama-30b-supercot-SuperHOT-8K-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-08-01T15:49:06.725548 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of TheBloke/llama-30b-supercot-SuperHOT-8K-fp16",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/llama-30b-supercot-SuperHOT-8K-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-08-01T15:49:06.725548 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
31,
31,
179,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of TheBloke/llama-30b-supercot-SuperHOT-8K-fp16## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/llama-30b-supercot-SuperHOT-8K-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-08-01T15:49:06.725548 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
be5f173207da569431acaae296ae02719627d770
|
# Dataset Card for Evaluation run of TheBloke/wizardLM-13B-1.0-fp16
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/wizardLM-13B-1.0-fp16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/wizardLM-13B-1.0-fp16](https://huggingface.co/TheBloke/wizardLM-13B-1.0-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__wizardLM-13B-1.0-fp16",
"harness_winogrande_5",
split="train")
```
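
To see which of the 64 configurations are available before loading one, the `datasets` library's `get_dataset_config_names` helper can be used; a minimal sketch:

```python
from datasets import get_dataset_config_names

# Lists every per-task config declared in this repo, plus the "results" config.
configs = get_dataset_config_names("open-llm-leaderboard/details_TheBloke__wizardLM-13B-1.0-fp16")
print(configs)
```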
## Latest results
These are the [latest results from run 2023-10-22T22:13:20.355454](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__wizardLM-13B-1.0-fp16/blob/main/results_2023-10-22T22-13-20.355454.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.06638003355704698,
"em_stderr": 0.0025494321051837475,
"f1": 0.14066380033557005,
"f1_stderr": 0.00288504029268502,
"acc": 0.439931114253282,
"acc_stderr": 0.010916082865895764
},
"harness|drop|3": {
"em": 0.06638003355704698,
"em_stderr": 0.0025494321051837475,
"f1": 0.14066380033557005,
"f1_stderr": 0.00288504029268502
},
"harness|gsm8k|5": {
"acc": 0.13874147081122062,
"acc_stderr": 0.009521649920798148
},
"harness|winogrande|5": {
"acc": 0.7411207576953434,
"acc_stderr": 0.01231051581099338
}
}
```
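
The aggregated numbers above can also be pulled programmatically. A minimal sketch using the "results" config and its "latest" split, both of which are declared in this card's metadata:

```python
from datasets import load_dataset

# "results" aggregates all task scores; "latest" points at the most recent run.
results = load_dataset("open-llm-leaderboard/details_TheBloke__wizardLM-13B-1.0-fp16",
	"results",
	split="latest")
```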
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_TheBloke__wizardLM-13B-1.0-fp16
|
[
"region:us"
] |
2023-08-18T10:25:41+00:00
|
{"pretty_name": "Evaluation run of TheBloke/wizardLM-13B-1.0-fp16", "dataset_summary": "Dataset automatically created during the evaluation run of model [TheBloke/wizardLM-13B-1.0-fp16](https://huggingface.co/TheBloke/wizardLM-13B-1.0-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__wizardLM-13B-1.0-fp16\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-22T22:13:20.355454](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__wizardLM-13B-1.0-fp16/blob/main/results_2023-10-22T22-13-20.355454.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.06638003355704698,\n \"em_stderr\": 0.0025494321051837475,\n \"f1\": 0.14066380033557005,\n \"f1_stderr\": 0.00288504029268502,\n \"acc\": 0.439931114253282,\n \"acc_stderr\": 0.010916082865895764\n },\n \"harness|drop|3\": {\n \"em\": 0.06638003355704698,\n \"em_stderr\": 0.0025494321051837475,\n \"f1\": 0.14066380033557005,\n \"f1_stderr\": 0.00288504029268502\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.13874147081122062,\n \"acc_stderr\": 0.009521649920798148\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7411207576953434,\n \"acc_stderr\": 0.01231051581099338\n }\n}\n```", "repo_url": "https://huggingface.co/TheBloke/wizardLM-13B-1.0-fp16", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T19_39_43.498686", "path": ["**/details_harness|arc:challenge|25_2023-07-19T19:39:43.498686.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T19:39:43.498686.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_22T22_13_20.355454", "path": ["**/details_harness|drop|3_2023-10-22T22-13-20.355454.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-22T22-13-20.355454.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_22T22_13_20.355454", "path": ["**/details_harness|gsm8k|5_2023-10-22T22-13-20.355454.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-22T22-13-20.355454.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T19_39_43.498686", "path": ["**/details_harness|hellaswag|10_2023-07-19T19:39:43.498686.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T19:39:43.498686.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T19_39_43.498686", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:39:43.498686.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:39:43.498686.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:39:43.498686.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T19:39:43.498686.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:39:43.498686.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T19_39_43.498686", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:39:43.498686.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:39:43.498686.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T19_39_43.498686", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:39:43.498686.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:39:43.498686.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T19_39_43.498686", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:39:43.498686.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:39:43.498686.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T19_39_43.498686", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:39:43.498686.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:39:43.498686.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T19_39_43.498686", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:39:43.498686.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:39:43.498686.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T19_39_43.498686", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:39:43.498686.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:39:43.498686.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T19_39_43.498686", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:39:43.498686.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:39:43.498686.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T19_39_43.498686", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:39:43.498686.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:39:43.498686.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T19_39_43.498686", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:39:43.498686.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:39:43.498686.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T19_39_43.498686", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:39:43.498686.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:39:43.498686.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T19_39_43.498686", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:39:43.498686.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:39:43.498686.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T19_39_43.498686", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:39:43.498686.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:39:43.498686.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T19_39_43.498686", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:39:43.498686.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:39:43.498686.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T19_39_43.498686", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:39:43.498686.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:39:43.498686.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T19_39_43.498686", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:39:43.498686.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:39:43.498686.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T19_39_43.498686", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:39:43.498686.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:39:43.498686.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T19_39_43.498686", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:39:43.498686.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:39:43.498686.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T19_39_43.498686", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:39:43.498686.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:39:43.498686.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T19_39_43.498686", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:39:43.498686.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:39:43.498686.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T19_39_43.498686", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:39:43.498686.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:39:43.498686.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T19_39_43.498686", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:39:43.498686.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:39:43.498686.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T19_39_43.498686", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:39:43.498686.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:39:43.498686.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T19_39_43.498686", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:39:43.498686.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:39:43.498686.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T19_39_43.498686", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:39:43.498686.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:39:43.498686.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T19_39_43.498686", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:39:43.498686.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:39:43.498686.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T19_39_43.498686", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:39:43.498686.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:39:43.498686.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T19_39_43.498686", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:39:43.498686.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:39:43.498686.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T19_39_43.498686", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:39:43.498686.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:39:43.498686.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T19_39_43.498686", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:39:43.498686.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:39:43.498686.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T19_39_43.498686", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:39:43.498686.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:39:43.498686.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T19_39_43.498686", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:39:43.498686.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:39:43.498686.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T19_39_43.498686", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:39:43.498686.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:39:43.498686.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T19_39_43.498686", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:39:43.498686.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:39:43.498686.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T19_39_43.498686", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:39:43.498686.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:39:43.498686.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T19_39_43.498686", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:39:43.498686.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:39:43.498686.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T19_39_43.498686", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:39:43.498686.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:39:43.498686.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T19_39_43.498686", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:39:43.498686.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:39:43.498686.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T19_39_43.498686", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:39:43.498686.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:39:43.498686.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T19_39_43.498686", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T19:39:43.498686.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T19:39:43.498686.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T19_39_43.498686", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:39:43.498686.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:39:43.498686.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T19_39_43.498686", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:39:43.498686.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:39:43.498686.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T19_39_43.498686", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:39:43.498686.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:39:43.498686.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T19_39_43.498686", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:39:43.498686.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:39:43.498686.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T19_39_43.498686", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:39:43.498686.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:39:43.498686.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T19_39_43.498686", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:39:43.498686.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:39:43.498686.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T19_39_43.498686", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:39:43.498686.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:39:43.498686.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T19_39_43.498686", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:39:43.498686.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:39:43.498686.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T19_39_43.498686", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:39:43.498686.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:39:43.498686.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T19_39_43.498686", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:39:43.498686.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:39:43.498686.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T19_39_43.498686", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:39:43.498686.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:39:43.498686.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T19_39_43.498686", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:39:43.498686.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:39:43.498686.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T19_39_43.498686", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:39:43.498686.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:39:43.498686.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T19_39_43.498686", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:39:43.498686.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:39:43.498686.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T19_39_43.498686", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:39:43.498686.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:39:43.498686.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T19_39_43.498686", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:39:43.498686.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:39:43.498686.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T19_39_43.498686", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T19:39:43.498686.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T19:39:43.498686.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T19_39_43.498686", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:39:43.498686.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:39:43.498686.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T19_39_43.498686", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T19:39:43.498686.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T19:39:43.498686.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_22T22_13_20.355454", "path": ["**/details_harness|winogrande|5_2023-10-22T22-13-20.355454.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-22T22-13-20.355454.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T19_39_43.498686", "path": ["results_2023-07-19T19:39:43.498686.parquet"]}, {"split": "2023_10_22T22_13_20.355454", "path": ["results_2023-10-22T22-13-20.355454.parquet"]}, {"split": "latest", "path": ["results_2023-10-22T22-13-20.355454.parquet"]}]}]}
|
2023-10-22T21:13:32+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of TheBloke/wizardLM-13B-1.0-fp16
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model TheBloke/wizardLM-13B-1.0-fp16 on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
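A minimal sketch of that loading call, mirroring the snippet used in the other cards in this collection (the dataset id `open-llm-leaderboard/details_TheBloke__wizardLM-13B-1.0-fp16` is assumed from the standard naming convention, since the explicit URL was stripped from this copy of the card):
```python
from datasets import load_dataset

# "train" always points at the latest run's per-sample details for this eval.
data = load_dataset("open-llm-leaderboard/details_TheBloke__wizardLM-13B-1.0-fp16",
    "harness_winogrande_5",
    split="train")
```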
## Latest results
These are the latest results from run 2023-10-22T22:13:20.355454 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of TheBloke/wizardLM-13B-1.0-fp16",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/wizardLM-13B-1.0-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-22T22:13:20.355454(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of TheBloke/wizardLM-13B-1.0-fp16",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/wizardLM-13B-1.0-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-22T22:13:20.355454(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
25,
31,
173,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of TheBloke/wizardLM-13B-1.0-fp16## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/wizardLM-13B-1.0-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-22T22:13:20.355454(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
ecc9fc8ab36d15fc2e0bb39d1da5a4ea268adda9
|
# Dataset Card for Evaluation run of TheBloke/UltraLM-13B-fp16
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/UltraLM-13B-fp16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/UltraLM-13B-fp16](https://huggingface.co/TheBloke/UltraLM-13B-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__UltraLM-13B-fp16",
"harness_winogrande_5",
split="train")
```
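The same call also works against the aggregated scores described above; a short sketch, assuming the `results` configuration and `latest` split declared in this repo's configs:
```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics; "latest" points to the newest run.
results = load_dataset("open-llm-leaderboard/details_TheBloke__UltraLM-13B-fp16",
    "results",
    split="latest")
print(results[0])
```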
## Latest results
These are the [latest results from run 2023-10-22T20:20:20.923100](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__UltraLM-13B-fp16/blob/main/results_2023-10-22T20-20-20.923100.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.01363255033557047,
"em_stderr": 0.0011875381552413013,
"f1": 0.08585046140939587,
"f1_stderr": 0.0018748006407108256,
"acc": 0.43269188767410677,
"acc_stderr": 0.010269983173766185
},
"harness|drop|3": {
"em": 0.01363255033557047,
"em_stderr": 0.0011875381552413013,
"f1": 0.08585046140939587,
"f1_stderr": 0.0018748006407108256
},
"harness|gsm8k|5": {
"acc": 0.1068991660348749,
"acc_stderr": 0.008510982565520497
},
"harness|winogrande|5": {
"acc": 0.7584846093133386,
"acc_stderr": 0.012028983782011875
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_TheBloke__UltraLM-13B-fp16
|
[
"region:us"
] |
2023-08-18T10:25:50+00:00
|
{"pretty_name": "Evaluation run of TheBloke/UltraLM-13B-fp16", "dataset_summary": "Dataset automatically created during the evaluation run of model [TheBloke/UltraLM-13B-fp16](https://huggingface.co/TheBloke/UltraLM-13B-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__UltraLM-13B-fp16\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-22T20:20:20.923100](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__UltraLM-13B-fp16/blob/main/results_2023-10-22T20-20-20.923100.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.01363255033557047,\n \"em_stderr\": 0.0011875381552413013,\n \"f1\": 0.08585046140939587,\n \"f1_stderr\": 0.0018748006407108256,\n \"acc\": 0.43269188767410677,\n \"acc_stderr\": 0.010269983173766185\n },\n \"harness|drop|3\": {\n \"em\": 0.01363255033557047,\n \"em_stderr\": 0.0011875381552413013,\n \"f1\": 0.08585046140939587,\n \"f1_stderr\": 0.0018748006407108256\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1068991660348749,\n \"acc_stderr\": 0.008510982565520497\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7584846093133386,\n \"acc_stderr\": 0.012028983782011875\n }\n}\n```", "repo_url": "https://huggingface.co/TheBloke/UltraLM-13B-fp16", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T19_33_28.322265", "path": ["**/details_harness|arc:challenge|25_2023-07-19T19:33:28.322265.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T19:33:28.322265.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_22T20_20_20.923100", "path": ["**/details_harness|drop|3_2023-10-22T20-20-20.923100.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-22T20-20-20.923100.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_22T20_20_20.923100", "path": ["**/details_harness|gsm8k|5_2023-10-22T20-20-20.923100.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-22T20-20-20.923100.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T19_33_28.322265", "path": ["**/details_harness|hellaswag|10_2023-07-19T19:33:28.322265.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T19:33:28.322265.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": 
[{"split": "2023_07_19T19_33_28.322265", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:33:28.322265.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:33:28.322265.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:33:28.322265.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T19:33:28.322265.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:33:28.322265.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T19_33_28.322265", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:33:28.322265.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:33:28.322265.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T19_33_28.322265", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:33:28.322265.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:33:28.322265.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T19_33_28.322265", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:33:28.322265.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:33:28.322265.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T19_33_28.322265", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:33:28.322265.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:33:28.322265.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T19_33_28.322265", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:33:28.322265.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:33:28.322265.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T19_33_28.322265", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:33:28.322265.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:33:28.322265.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T19_33_28.322265", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:33:28.322265.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:33:28.322265.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T19_33_28.322265", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:33:28.322265.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:33:28.322265.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T19_33_28.322265", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:33:28.322265.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:33:28.322265.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T19_33_28.322265", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:33:28.322265.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:33:28.322265.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T19_33_28.322265", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:33:28.322265.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:33:28.322265.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T19_33_28.322265", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:33:28.322265.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:33:28.322265.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T19_33_28.322265", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:33:28.322265.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:33:28.322265.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T19_33_28.322265", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:33:28.322265.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:33:28.322265.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T19_33_28.322265", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:33:28.322265.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:33:28.322265.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T19_33_28.322265", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:33:28.322265.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:33:28.322265.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T19_33_28.322265", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:33:28.322265.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:33:28.322265.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T19_33_28.322265", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:33:28.322265.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:33:28.322265.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T19_33_28.322265", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:33:28.322265.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:33:28.322265.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T19_33_28.322265", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:33:28.322265.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:33:28.322265.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T19_33_28.322265", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:33:28.322265.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:33:28.322265.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T19_33_28.322265", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:33:28.322265.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:33:28.322265.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T19_33_28.322265", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:33:28.322265.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:33:28.322265.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T19_33_28.322265", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:33:28.322265.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:33:28.322265.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T19_33_28.322265", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:33:28.322265.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:33:28.322265.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T19_33_28.322265", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:33:28.322265.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:33:28.322265.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T19_33_28.322265", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:33:28.322265.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:33:28.322265.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T19_33_28.322265", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:33:28.322265.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:33:28.322265.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T19_33_28.322265", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:33:28.322265.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:33:28.322265.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T19_33_28.322265", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:33:28.322265.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:33:28.322265.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T19_33_28.322265", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:33:28.322265.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:33:28.322265.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T19_33_28.322265", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:33:28.322265.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:33:28.322265.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T19_33_28.322265", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:33:28.322265.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:33:28.322265.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T19_33_28.322265", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:33:28.322265.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:33:28.322265.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T19_33_28.322265", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:33:28.322265.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:33:28.322265.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T19_33_28.322265", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:33:28.322265.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:33:28.322265.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T19_33_28.322265", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:33:28.322265.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:33:28.322265.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T19_33_28.322265", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:33:28.322265.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:33:28.322265.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T19_33_28.322265", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T19:33:28.322265.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T19:33:28.322265.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T19_33_28.322265", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:33:28.322265.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:33:28.322265.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T19_33_28.322265", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:33:28.322265.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:33:28.322265.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T19_33_28.322265", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:33:28.322265.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:33:28.322265.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T19_33_28.322265", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:33:28.322265.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:33:28.322265.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T19_33_28.322265", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:33:28.322265.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:33:28.322265.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T19_33_28.322265", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:33:28.322265.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:33:28.322265.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T19_33_28.322265", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:33:28.322265.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:33:28.322265.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T19_33_28.322265", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:33:28.322265.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:33:28.322265.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T19_33_28.322265", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:33:28.322265.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:33:28.322265.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T19_33_28.322265", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:33:28.322265.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:33:28.322265.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T19_33_28.322265", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:33:28.322265.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:33:28.322265.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T19_33_28.322265", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:33:28.322265.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:33:28.322265.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T19_33_28.322265", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:33:28.322265.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:33:28.322265.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T19_33_28.322265", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:33:28.322265.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:33:28.322265.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T19_33_28.322265", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:33:28.322265.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:33:28.322265.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T19_33_28.322265", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:33:28.322265.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:33:28.322265.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T19_33_28.322265", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T19:33:28.322265.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T19:33:28.322265.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T19_33_28.322265", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:33:28.322265.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:33:28.322265.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T19_33_28.322265", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T19:33:28.322265.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T19:33:28.322265.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_22T20_20_20.923100", "path": ["**/details_harness|winogrande|5_2023-10-22T20-20-20.923100.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-22T20-20-20.923100.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T19_33_28.322265", "path": ["results_2023-07-19T19:33:28.322265.parquet"]}, {"split": "2023_10_22T20_20_20.923100", "path": ["results_2023-10-22T20-20-20.923100.parquet"]}, {"split": "latest", "path": ["results_2023-10-22T20-20-20.923100.parquet"]}]}]}
|
2023-10-22T19:20:34+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of TheBloke/UltraLM-13B-fp16
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model TheBloke/UltraLM-13B-fp16 on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
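The loading snippet was stripped from this copy of the card; for reference, a sketch mirroring the full card above:
```python
from datasets import load_dataset

data = load_dataset("open-llm-leaderboard/details_TheBloke__UltraLM-13B-fp16",
    "harness_winogrande_5",
    split="train")
```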
## Latest results
These are the latest results from run 2023-10-22T20:20:20.923100 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of TheBloke/UltraLM-13B-fp16",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/UltraLM-13B-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-22T20:20:20.923100(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of TheBloke/UltraLM-13B-fp16",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/UltraLM-13B-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-22T20:20:20.923100(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
23,
31,
171,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of TheBloke/UltraLM-13B-fp16## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/UltraLM-13B-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-22T20:20:20.923100(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
200c0d2ab7f354fc87755da1cc9617da42b67ab7
|
# Dataset Card for Evaluation run of TheBloke/Wizard-Vicuna-13B-Uncensored-HF
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/Wizard-Vicuna-13B-Uncensored-HF
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/Wizard-Vicuna-13B-Uncensored-HF](https://huggingface.co/TheBloke/Wizard-Vicuna-13B-Uncensored-HF) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__Wizard-Vicuna-13B-Uncensored-HF",
"harness_winogrande_5",
split="train")
```
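For orientation, here is a minimal sketch (not part of the original card, assuming the standard `get_dataset_config_names` helper from the `datasets` library) that lists the available configurations before picking one to load:
```python
from datasets import get_dataset_config_names

# List the configurations of this dataset; per the card there are 64 task
# configurations, plus the additional "results" configuration.
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_TheBloke__Wizard-Vicuna-13B-Uncensored-HF"
)
print(len(configs))
print(configs[:5])
```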
## Latest results
These are the [latest results from run 2023-10-23T01:03:04.641003](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Wizard-Vicuna-13B-Uncensored-HF/blob/main/results_2023-10-23T01-03-04.641003.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.14314177852348994,
"em_stderr": 0.0035865537174832513,
"f1": 0.2178586409395965,
"f1_stderr": 0.003730334446277459,
"acc": 0.4216675951562166,
"acc_stderr": 0.00989785498376742
},
"harness|drop|3": {
"em": 0.14314177852348994,
"em_stderr": 0.0035865537174832513,
"f1": 0.2178586409395965,
"f1_stderr": 0.003730334446277459
},
"harness|gsm8k|5": {
"acc": 0.08642911296436695,
"acc_stderr": 0.0077400443371038056
},
"harness|winogrande|5": {
"acc": 0.7569060773480663,
"acc_stderr": 0.012055665630431032
}
}
```
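As a hedged sketch of the "results" configuration described above (the config and split names are taken from this card's own metadata), the aggregated metrics can be loaded directly, with the "latest" split pointing at the most recent run:
```python
from datasets import load_dataset

# Load the aggregated "results" configuration; per the card, the "latest"
# split always points to the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_TheBloke__Wizard-Vicuna-13B-Uncensored-HF",
    "results",
    split="latest",
)
print(results)
```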
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_TheBloke__Wizard-Vicuna-13B-Uncensored-HF
|
[
"region:us"
] |
2023-08-18T10:26:00+00:00
|
{"pretty_name": "Evaluation run of TheBloke/Wizard-Vicuna-13B-Uncensored-HF", "dataset_summary": "Dataset automatically created during the evaluation run of model [TheBloke/Wizard-Vicuna-13B-Uncensored-HF](https://huggingface.co/TheBloke/Wizard-Vicuna-13B-Uncensored-HF) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__Wizard-Vicuna-13B-Uncensored-HF\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-23T01:03:04.641003](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Wizard-Vicuna-13B-Uncensored-HF/blob/main/results_2023-10-23T01-03-04.641003.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.14314177852348994,\n \"em_stderr\": 0.0035865537174832513,\n \"f1\": 0.2178586409395965,\n \"f1_stderr\": 0.003730334446277459,\n \"acc\": 0.4216675951562166,\n \"acc_stderr\": 0.00989785498376742\n },\n \"harness|drop|3\": {\n \"em\": 0.14314177852348994,\n \"em_stderr\": 0.0035865537174832513,\n \"f1\": 0.2178586409395965,\n \"f1_stderr\": 0.003730334446277459\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.08642911296436695,\n \"acc_stderr\": 0.0077400443371038056\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7569060773480663,\n \"acc_stderr\": 0.012055665630431032\n }\n}\n```", "repo_url": "https://huggingface.co/TheBloke/Wizard-Vicuna-13B-Uncensored-HF", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_18T16_17_31.150663", "path": ["**/details_harness|arc:challenge|25_2023-07-18T16:17:31.150663.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-18T16:17:31.150663.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_23T01_03_04.641003", "path": ["**/details_harness|drop|3_2023-10-23T01-03-04.641003.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-23T01-03-04.641003.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_23T01_03_04.641003", "path": ["**/details_harness|gsm8k|5_2023-10-23T01-03-04.641003.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-23T01-03-04.641003.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_18T16_17_31.150663", "path": ["**/details_harness|hellaswag|10_2023-07-18T16:17:31.150663.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hellaswag|10_2023-07-18T16:17:31.150663.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_18T16_17_31.150663", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T16:17:31.150663.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-18T16:17:31.150663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-18T16:17:31.150663.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T16:17:31.150663.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-18T16:17:31.150663.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-07-18T16:17:31.150663.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_18T16_17_31.150663", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T16:17:31.150663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T16:17:31.150663.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_18T16_17_31.150663", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-18T16:17:31.150663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-18T16:17:31.150663.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_18T16_17_31.150663", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-18T16:17:31.150663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-18T16:17:31.150663.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_18T16_17_31.150663", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T16:17:31.150663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T16:17:31.150663.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_18T16_17_31.150663", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T16:17:31.150663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T16:17:31.150663.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_18T16_17_31.150663", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-18T16:17:31.150663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-18T16:17:31.150663.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_18T16_17_31.150663", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T16:17:31.150663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T16:17:31.150663.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_18T16_17_31.150663", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T16:17:31.150663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T16:17:31.150663.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_18T16_17_31.150663", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T16:17:31.150663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T16:17:31.150663.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_18T16_17_31.150663", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T16:17:31.150663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T16:17:31.150663.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_07_18T16_17_31.150663", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-18T16:17:31.150663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-18T16:17:31.150663.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_18T16_17_31.150663", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-18T16:17:31.150663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-18T16:17:31.150663.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_18T16_17_31.150663", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T16:17:31.150663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T16:17:31.150663.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_18T16_17_31.150663", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-18T16:17:31.150663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-18T16:17:31.150663.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_18T16_17_31.150663", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T16:17:31.150663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T16:17:31.150663.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_18T16_17_31.150663", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T16:17:31.150663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T16:17:31.150663.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_18T16_17_31.150663", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T16:17:31.150663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T16:17:31.150663.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_18T16_17_31.150663", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-18T16:17:31.150663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-18T16:17:31.150663.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_18T16_17_31.150663", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T16:17:31.150663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T16:17:31.150663.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_18T16_17_31.150663", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T16:17:31.150663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T16:17:31.150663.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_18T16_17_31.150663", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T16:17:31.150663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T16:17:31.150663.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_18T16_17_31.150663", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T16:17:31.150663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T16:17:31.150663.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_18T16_17_31.150663", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T16:17:31.150663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T16:17:31.150663.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_18T16_17_31.150663", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T16:17:31.150663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T16:17:31.150663.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_18T16_17_31.150663", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T16:17:31.150663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T16:17:31.150663.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_18T16_17_31.150663", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T16:17:31.150663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T16:17:31.150663.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_18T16_17_31.150663", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T16:17:31.150663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T16:17:31.150663.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_18T16_17_31.150663", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T16:17:31.150663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T16:17:31.150663.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_18T16_17_31.150663", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T16:17:31.150663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T16:17:31.150663.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_18T16_17_31.150663", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T16:17:31.150663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T16:17:31.150663.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_18T16_17_31.150663", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T16:17:31.150663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T16:17:31.150663.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_18T16_17_31.150663", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T16:17:31.150663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T16:17:31.150663.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_18T16_17_31.150663", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-18T16:17:31.150663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-18T16:17:31.150663.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_18T16_17_31.150663", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T16:17:31.150663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T16:17:31.150663.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_18T16_17_31.150663", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-18T16:17:31.150663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-18T16:17:31.150663.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_18T16_17_31.150663", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T16:17:31.150663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T16:17:31.150663.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_18T16_17_31.150663", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T16:17:31.150663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T16:17:31.150663.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_18T16_17_31.150663", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T16:17:31.150663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T16:17:31.150663.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_18T16_17_31.150663", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-18T16:17:31.150663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-18T16:17:31.150663.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_18T16_17_31.150663", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-18T16:17:31.150663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-18T16:17:31.150663.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_18T16_17_31.150663", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T16:17:31.150663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T16:17:31.150663.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_18T16_17_31.150663", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T16:17:31.150663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T16:17:31.150663.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_18T16_17_31.150663", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T16:17:31.150663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T16:17:31.150663.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_18T16_17_31.150663", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T16:17:31.150663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T16:17:31.150663.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_18T16_17_31.150663", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-18T16:17:31.150663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-18T16:17:31.150663.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_18T16_17_31.150663", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-18T16:17:31.150663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-18T16:17:31.150663.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_18T16_17_31.150663", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-18T16:17:31.150663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-18T16:17:31.150663.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_18T16_17_31.150663", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T16:17:31.150663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T16:17:31.150663.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_18T16_17_31.150663", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-18T16:17:31.150663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-18T16:17:31.150663.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_18T16_17_31.150663", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T16:17:31.150663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T16:17:31.150663.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_18T16_17_31.150663", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T16:17:31.150663.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T16:17:31.150663.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_18T16_17_31.150663", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-18T16:17:31.150663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-18T16:17:31.150663.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_18T16_17_31.150663", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-18T16:17:31.150663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-18T16:17:31.150663.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_18T16_17_31.150663", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-18T16:17:31.150663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-18T16:17:31.150663.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_18T16_17_31.150663", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T16:17:31.150663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T16:17:31.150663.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_18T16_17_31.150663", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-18T16:17:31.150663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-18T16:17:31.150663.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_18T16_17_31.150663", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-18T16:17:31.150663.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-18T16:17:31.150663.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_18T16_17_31.150663", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-18T16:17:31.150663.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-18T16:17:31.150663.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_23T01_03_04.641003", "path": ["**/details_harness|winogrande|5_2023-10-23T01-03-04.641003.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-23T01-03-04.641003.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_18T16_17_31.150663", "path": ["results_2023-07-18T16:17:31.150663.parquet"]}, {"split": "2023_10_23T01_03_04.641003", "path": ["results_2023-10-23T01-03-04.641003.parquet"]}, {"split": "latest", "path": ["results_2023-10-23T01-03-04.641003.parquet"]}]}]}
|
2023-10-23T00:03:16+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of TheBloke/Wizard-Vicuna-13B-Uncensored-HF
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model TheBloke/Wizard-Vicuna-13B-Uncensored-HF on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-10-23T01:03:04.641003 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of TheBloke/Wizard-Vicuna-13B-Uncensored-HF",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/Wizard-Vicuna-13B-Uncensored-HF on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-23T01:03:04.641003(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of TheBloke/Wizard-Vicuna-13B-Uncensored-HF",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/Wizard-Vicuna-13B-Uncensored-HF on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-23T01:03:04.641003(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
29,
31,
177,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of TheBloke/Wizard-Vicuna-13B-Uncensored-HF## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/Wizard-Vicuna-13B-Uncensored-HF on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-23T01:03:04.641003(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
cec808f41556cc0ec85d41512e1a446785a58195
|
# Dataset Card for Evaluation run of TheBloke/landmark-attention-llama7b-fp16
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/landmark-attention-llama7b-fp16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/landmark-attention-llama7b-fp16](https://huggingface.co/TheBloke/landmark-attention-llama7b-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__landmark-attention-llama7b-fp16",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-22T21:06:08.838189](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__landmark-attention-llama7b-fp16/blob/main/results_2023-10-22T21-06-08.838189.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0014681208053691276,
"em_stderr": 0.0003921042190298539,
"f1": 0.04697252516778534,
"f1_stderr": 0.0013361369387872978,
"acc": 0.34813421471026634,
"acc_stderr": 0.008277173895027065
},
"harness|drop|3": {
"em": 0.0014681208053691276,
"em_stderr": 0.0003921042190298539,
"f1": 0.04697252516778534,
"f1_stderr": 0.0013361369387872978
},
"harness|gsm8k|5": {
"acc": 0.01592115238817286,
"acc_stderr": 0.0034478192723890015
},
"harness|winogrande|5": {
"acc": 0.6803472770323599,
"acc_stderr": 0.01310652851766513
}
}
```
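Beyond the "latest" split, each run is also reachable under its timestamped split name; a minimal sketch follows (the config name and split name below are taken from this card's config metadata, not invented):
```python
from datasets import load_dataset

# Load one specific run of the DROP task by its timestamped split name,
# as listed in this dataset's config metadata.
drop_run = load_dataset(
    "open-llm-leaderboard/details_TheBloke__landmark-attention-llama7b-fp16",
    "harness_drop_3",
    split="2023_10_22T21_06_08.838189",
)
print(drop_run)
```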
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_TheBloke__landmark-attention-llama7b-fp16
|
[
"region:us"
] |
2023-08-18T10:26:08+00:00
|
{"pretty_name": "Evaluation run of TheBloke/landmark-attention-llama7b-fp16", "dataset_summary": "Dataset automatically created during the evaluation run of model [TheBloke/landmark-attention-llama7b-fp16](https://huggingface.co/TheBloke/landmark-attention-llama7b-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__landmark-attention-llama7b-fp16\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-22T21:06:08.838189](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__landmark-attention-llama7b-fp16/blob/main/results_2023-10-22T21-06-08.838189.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0014681208053691276,\n \"em_stderr\": 0.0003921042190298539,\n \"f1\": 0.04697252516778534,\n \"f1_stderr\": 0.0013361369387872978,\n \"acc\": 0.34813421471026634,\n \"acc_stderr\": 0.008277173895027065\n },\n \"harness|drop|3\": {\n \"em\": 0.0014681208053691276,\n \"em_stderr\": 0.0003921042190298539,\n \"f1\": 0.04697252516778534,\n \"f1_stderr\": 0.0013361369387872978\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.01592115238817286,\n \"acc_stderr\": 0.0034478192723890015\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6803472770323599,\n \"acc_stderr\": 0.01310652851766513\n }\n}\n```", "repo_url": "https://huggingface.co/TheBloke/landmark-attention-llama7b-fp16", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_31T15_07_15.770295", "path": ["**/details_harness|arc:challenge|25_2023-07-31T15:07:15.770295.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-31T15:07:15.770295.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_22T21_06_08.838189", "path": ["**/details_harness|drop|3_2023-10-22T21-06-08.838189.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-22T21-06-08.838189.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_22T21_06_08.838189", "path": ["**/details_harness|gsm8k|5_2023-10-22T21-06-08.838189.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-22T21-06-08.838189.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_31T15_07_15.770295", "path": ["**/details_harness|hellaswag|10_2023-07-31T15:07:15.770295.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hellaswag|10_2023-07-31T15:07:15.770295.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_31T15_07_15.770295", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T15:07:15.770295.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-31T15:07:15.770295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-31T15:07:15.770295.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T15:07:15.770295.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-31T15:07:15.770295.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-07-31T15:07:15.770295.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_31T15_07_15.770295", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T15:07:15.770295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T15:07:15.770295.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_31T15_07_15.770295", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-31T15:07:15.770295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-31T15:07:15.770295.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_31T15_07_15.770295", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-31T15:07:15.770295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-31T15:07:15.770295.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_31T15_07_15.770295", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T15:07:15.770295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T15:07:15.770295.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_31T15_07_15.770295", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T15:07:15.770295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T15:07:15.770295.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_31T15_07_15.770295", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-31T15:07:15.770295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-31T15:07:15.770295.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_31T15_07_15.770295", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T15:07:15.770295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T15:07:15.770295.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_31T15_07_15.770295", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T15:07:15.770295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T15:07:15.770295.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_31T15_07_15.770295", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T15:07:15.770295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T15:07:15.770295.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_31T15_07_15.770295", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T15:07:15.770295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T15:07:15.770295.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_07_31T15_07_15.770295", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-31T15:07:15.770295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-31T15:07:15.770295.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_31T15_07_15.770295", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-31T15:07:15.770295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-31T15:07:15.770295.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_31T15_07_15.770295", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T15:07:15.770295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T15:07:15.770295.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_31T15_07_15.770295", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-31T15:07:15.770295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-31T15:07:15.770295.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_31T15_07_15.770295", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T15:07:15.770295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T15:07:15.770295.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_31T15_07_15.770295", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T15:07:15.770295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T15:07:15.770295.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_31T15_07_15.770295", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T15:07:15.770295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T15:07:15.770295.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_31T15_07_15.770295", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-31T15:07:15.770295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-31T15:07:15.770295.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_31T15_07_15.770295", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T15:07:15.770295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T15:07:15.770295.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_31T15_07_15.770295", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T15:07:15.770295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T15:07:15.770295.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_31T15_07_15.770295", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T15:07:15.770295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T15:07:15.770295.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_31T15_07_15.770295", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T15:07:15.770295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T15:07:15.770295.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_31T15_07_15.770295", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T15:07:15.770295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T15:07:15.770295.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_31T15_07_15.770295", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T15:07:15.770295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T15:07:15.770295.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_31T15_07_15.770295", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T15:07:15.770295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T15:07:15.770295.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_31T15_07_15.770295", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T15:07:15.770295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T15:07:15.770295.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_31T15_07_15.770295", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T15:07:15.770295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T15:07:15.770295.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_31T15_07_15.770295", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T15:07:15.770295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T15:07:15.770295.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_31T15_07_15.770295", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T15:07:15.770295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T15:07:15.770295.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_31T15_07_15.770295", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T15:07:15.770295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T15:07:15.770295.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_31T15_07_15.770295", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T15:07:15.770295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T15:07:15.770295.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_31T15_07_15.770295", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T15:07:15.770295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T15:07:15.770295.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_31T15_07_15.770295", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-31T15:07:15.770295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-31T15:07:15.770295.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_31T15_07_15.770295", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T15:07:15.770295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T15:07:15.770295.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_31T15_07_15.770295", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-31T15:07:15.770295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-31T15:07:15.770295.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_31T15_07_15.770295", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T15:07:15.770295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T15:07:15.770295.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_31T15_07_15.770295", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T15:07:15.770295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T15:07:15.770295.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_31T15_07_15.770295", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T15:07:15.770295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T15:07:15.770295.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_31T15_07_15.770295", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-31T15:07:15.770295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-31T15:07:15.770295.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_31T15_07_15.770295", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-31T15:07:15.770295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-31T15:07:15.770295.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_31T15_07_15.770295", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T15:07:15.770295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T15:07:15.770295.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_31T15_07_15.770295", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T15:07:15.770295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T15:07:15.770295.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_31T15_07_15.770295", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T15:07:15.770295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T15:07:15.770295.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_31T15_07_15.770295", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T15:07:15.770295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T15:07:15.770295.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_31T15_07_15.770295", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-31T15:07:15.770295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-31T15:07:15.770295.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_31T15_07_15.770295", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-31T15:07:15.770295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-31T15:07:15.770295.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_31T15_07_15.770295", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-31T15:07:15.770295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-31T15:07:15.770295.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_31T15_07_15.770295", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T15:07:15.770295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T15:07:15.770295.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_31T15_07_15.770295", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-31T15:07:15.770295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-31T15:07:15.770295.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_31T15_07_15.770295", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T15:07:15.770295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T15:07:15.770295.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_31T15_07_15.770295", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T15:07:15.770295.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T15:07:15.770295.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_31T15_07_15.770295", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-31T15:07:15.770295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-31T15:07:15.770295.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_31T15_07_15.770295", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-31T15:07:15.770295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-31T15:07:15.770295.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_31T15_07_15.770295", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-31T15:07:15.770295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-31T15:07:15.770295.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_31T15_07_15.770295", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T15:07:15.770295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T15:07:15.770295.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_31T15_07_15.770295", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-31T15:07:15.770295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-31T15:07:15.770295.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_31T15_07_15.770295", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-31T15:07:15.770295.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-31T15:07:15.770295.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_31T15_07_15.770295", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-31T15:07:15.770295.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-31T15:07:15.770295.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_22T21_06_08.838189", "path": ["**/details_harness|winogrande|5_2023-10-22T21-06-08.838189.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-22T21-06-08.838189.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_31T15_07_15.770295", "path": ["results_2023-07-31T15:07:15.770295.parquet"]}, {"split": "2023_10_22T21_06_08.838189", "path": ["results_2023-10-22T21-06-08.838189.parquet"]}, {"split": "latest", "path": ["results_2023-10-22T21-06-08.838189.parquet"]}]}]}
|
2023-10-22T20:06:21+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of TheBloke/landmark-attention-llama7b-fp16
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model TheBloke/landmark-attention-llama7b-fp16 on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
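A minimal sketch following the template used by the other cards in this dump, assuming the repository id `open-llm-leaderboard/details_TheBloke__landmark-attention-llama7b-fp16` (inferred from the `details_{org}__{model}` pattern the other evaluation repositories here follow):

```python
from datasets import load_dataset

# Hypothetical repository id, inferred from the "details_{org}__{model}" pattern
data = load_dataset("open-llm-leaderboard/details_TheBloke__landmark-attention-llama7b-fp16",
                    "harness_winogrande_5",
                    split="train")
```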
## Latest results
These are the latest results from run 2023-10-22T21:06:08.838189 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of TheBloke/landmark-attention-llama7b-fp16",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/landmark-attention-llama7b-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-22T21:06:08.838189(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of TheBloke/landmark-attention-llama7b-fp16",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/landmark-attention-llama7b-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-22T21:06:08.838189(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
28,
31,
176,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of TheBloke/landmark-attention-llama7b-fp16## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/landmark-attention-llama7b-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-22T21:06:08.838189(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
cf870dda2481d17d86cc58cf228c6da7869c6907
|
# Dataset Card for Evaluation run of TheBloke/WizardLM-30B-fp16
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/WizardLM-30B-fp16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/WizardLM-30B-fp16](https://huggingface.co/TheBloke/WizardLM-30B-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__WizardLM-30B-fp16",
"harness_winogrande_5",
split="train")
```
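With 64 task configurations plus the aggregated "results" one, it can help to enumerate the available config names before picking one; a minimal sketch using the `datasets` library's `get_dataset_config_names` helper with the repository id from the snippet above:

```python
from datasets import get_dataset_config_names

# Enumerate every config exposed by the details repository
configs = get_dataset_config_names("open-llm-leaderboard/details_TheBloke__WizardLM-30B-fp16")
print(len(configs))   # per the summary above: 64 task configs plus "results"
print(configs[:5])    # peek at the first few names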
## Latest results
These are the [latest results from run 2023-10-23T00:26:26.066701](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__WizardLM-30B-fp16/blob/main/results_2023-10-23T00-26-26.066701.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.2868078859060403,
"em_stderr": 0.004631679094136414,
"f1": 0.36250838926174567,
"f1_stderr": 0.004522951158382507,
"acc": 0.49859858913469757,
"acc_stderr": 0.011592515233281033
},
"harness|drop|3": {
"em": 0.2868078859060403,
"em_stderr": 0.004631679094136414,
"f1": 0.36250838926174567,
"f1_stderr": 0.004522951158382507
},
"harness|gsm8k|5": {
"acc": 0.2221379833206975,
"acc_stderr": 0.011449986902435323
},
"harness|winogrande|5": {
"acc": 0.7750591949486977,
"acc_stderr": 0.011735043564126742
}
}
```
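The aggregated metrics shown above are also stored as their own configuration; a minimal sketch loading it, assuming the `results` config name and `latest` split that the metadata of these repositories declares:

```python
from datasets import load_dataset

# Load the aggregated "results" config at its "latest" split
results = load_dataset("open-llm-leaderboard/details_TheBloke__WizardLM-30B-fp16",
                       "results",
                       split="latest")
print(results)  # the stored aggregate mirrors the JSON summary above
```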
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_TheBloke__WizardLM-30B-fp16
|
[
"region:us"
] |
2023-08-18T10:26:17+00:00
|
{"pretty_name": "Evaluation run of TheBloke/WizardLM-30B-fp16", "dataset_summary": "Dataset automatically created during the evaluation run of model [TheBloke/WizardLM-30B-fp16](https://huggingface.co/TheBloke/WizardLM-30B-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__WizardLM-30B-fp16\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-23T00:26:26.066701](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__WizardLM-30B-fp16/blob/main/results_2023-10-23T00-26-26.066701.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.2868078859060403,\n \"em_stderr\": 0.004631679094136414,\n \"f1\": 0.36250838926174567,\n \"f1_stderr\": 0.004522951158382507,\n \"acc\": 0.49859858913469757,\n \"acc_stderr\": 0.011592515233281033\n },\n \"harness|drop|3\": {\n \"em\": 0.2868078859060403,\n \"em_stderr\": 0.004631679094136414,\n \"f1\": 0.36250838926174567,\n \"f1_stderr\": 0.004522951158382507\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2221379833206975,\n \"acc_stderr\": 0.011449986902435323\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7750591949486977,\n \"acc_stderr\": 0.011735043564126742\n }\n}\n```", "repo_url": "https://huggingface.co/TheBloke/WizardLM-30B-fp16", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_31T12_57_51.572522", "path": ["**/details_harness|arc:challenge|25_2023-07-31T12:57:51.572522.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-31T12:57:51.572522.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_22T15_13_10.027241", "path": ["**/details_harness|drop|3_2023-10-22T15-13-10.027241.parquet"]}, {"split": "2023_10_23T00_26_26.066701", "path": ["**/details_harness|drop|3_2023-10-23T00-26-26.066701.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-23T00-26-26.066701.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_22T15_13_10.027241", "path": ["**/details_harness|gsm8k|5_2023-10-22T15-13-10.027241.parquet"]}, {"split": "2023_10_23T00_26_26.066701", "path": ["**/details_harness|gsm8k|5_2023-10-23T00-26-26.066701.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-23T00-26-26.066701.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_31T12_57_51.572522", "path": 
["**/details_harness|hellaswag|10_2023-07-31T12:57:51.572522.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-31T12:57:51.572522.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_31T12_57_51.572522", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-31T12:57:51.572522.parquet", 
"**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-31T12:57:51.572522.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T12:57:51.572522.parquet", 
"**/details_harness|hendrycksTest-global_facts|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T12:57:51.572522.parquet", 
"**/details_harness|hendrycksTest-virology|5_2023-07-31T12:57:51.572522.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-31T12:57:51.572522.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_31T12_57_51.572522", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T12:57:51.572522.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T12:57:51.572522.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_31T12_57_51.572522", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-31T12:57:51.572522.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-31T12:57:51.572522.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_31T12_57_51.572522", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-31T12:57:51.572522.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-31T12:57:51.572522.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_31T12_57_51.572522", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T12:57:51.572522.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T12:57:51.572522.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_31T12_57_51.572522", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T12:57:51.572522.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T12:57:51.572522.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_31T12_57_51.572522", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-31T12:57:51.572522.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-31T12:57:51.572522.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_31T12_57_51.572522", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T12:57:51.572522.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T12:57:51.572522.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_31T12_57_51.572522", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T12:57:51.572522.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T12:57:51.572522.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_31T12_57_51.572522", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T12:57:51.572522.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T12:57:51.572522.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_31T12_57_51.572522", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T12:57:51.572522.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T12:57:51.572522.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_31T12_57_51.572522", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-31T12:57:51.572522.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-31T12:57:51.572522.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_31T12_57_51.572522", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-31T12:57:51.572522.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-31T12:57:51.572522.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_31T12_57_51.572522", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T12:57:51.572522.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T12:57:51.572522.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_31T12_57_51.572522", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-31T12:57:51.572522.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-31T12:57:51.572522.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_31T12_57_51.572522", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T12:57:51.572522.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T12:57:51.572522.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_31T12_57_51.572522", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T12:57:51.572522.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T12:57:51.572522.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_31T12_57_51.572522", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T12:57:51.572522.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T12:57:51.572522.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_31T12_57_51.572522", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-31T12:57:51.572522.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-31T12:57:51.572522.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_31T12_57_51.572522", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T12:57:51.572522.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T12:57:51.572522.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_31T12_57_51.572522", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T12:57:51.572522.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T12:57:51.572522.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_31T12_57_51.572522", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T12:57:51.572522.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T12:57:51.572522.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_31T12_57_51.572522", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T12:57:51.572522.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T12:57:51.572522.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_31T12_57_51.572522", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T12:57:51.572522.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T12:57:51.572522.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_31T12_57_51.572522", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T12:57:51.572522.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T12:57:51.572522.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_31T12_57_51.572522", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T12:57:51.572522.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T12:57:51.572522.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_31T12_57_51.572522", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T12:57:51.572522.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T12:57:51.572522.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_31T12_57_51.572522", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T12:57:51.572522.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T12:57:51.572522.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_31T12_57_51.572522", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T12:57:51.572522.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T12:57:51.572522.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_31T12_57_51.572522", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T12:57:51.572522.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T12:57:51.572522.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_31T12_57_51.572522", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T12:57:51.572522.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T12:57:51.572522.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_31T12_57_51.572522", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T12:57:51.572522.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T12:57:51.572522.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_31T12_57_51.572522", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T12:57:51.572522.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T12:57:51.572522.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_31T12_57_51.572522", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-31T12:57:51.572522.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-31T12:57:51.572522.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_31T12_57_51.572522", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T12:57:51.572522.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T12:57:51.572522.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_31T12_57_51.572522", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-31T12:57:51.572522.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-31T12:57:51.572522.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_31T12_57_51.572522", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T12:57:51.572522.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T12:57:51.572522.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_31T12_57_51.572522", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T12:57:51.572522.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T12:57:51.572522.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_31T12_57_51.572522", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T12:57:51.572522.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T12:57:51.572522.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_31T12_57_51.572522", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-31T12:57:51.572522.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-31T12:57:51.572522.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_31T12_57_51.572522", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-31T12:57:51.572522.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-31T12:57:51.572522.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_31T12_57_51.572522", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T12:57:51.572522.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T12:57:51.572522.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_31T12_57_51.572522", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T12:57:51.572522.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T12:57:51.572522.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_31T12_57_51.572522", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T12:57:51.572522.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T12:57:51.572522.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_31T12_57_51.572522", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T12:57:51.572522.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T12:57:51.572522.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_31T12_57_51.572522", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-31T12:57:51.572522.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-31T12:57:51.572522.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_31T12_57_51.572522", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-31T12:57:51.572522.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-31T12:57:51.572522.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_31T12_57_51.572522", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-31T12:57:51.572522.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-31T12:57:51.572522.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_31T12_57_51.572522", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T12:57:51.572522.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T12:57:51.572522.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_31T12_57_51.572522", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-31T12:57:51.572522.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-31T12:57:51.572522.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_31T12_57_51.572522", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T12:57:51.572522.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T12:57:51.572522.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_31T12_57_51.572522", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T12:57:51.572522.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T12:57:51.572522.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_31T12_57_51.572522", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-31T12:57:51.572522.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-31T12:57:51.572522.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_31T12_57_51.572522", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-31T12:57:51.572522.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-31T12:57:51.572522.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_31T12_57_51.572522", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-31T12:57:51.572522.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-31T12:57:51.572522.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_31T12_57_51.572522", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T12:57:51.572522.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T12:57:51.572522.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_31T12_57_51.572522", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-31T12:57:51.572522.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-31T12:57:51.572522.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_31T12_57_51.572522", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-31T12:57:51.572522.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-31T12:57:51.572522.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_31T12_57_51.572522", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-31T12:57:51.572522.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-31T12:57:51.572522.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_22T15_13_10.027241", "path": ["**/details_harness|winogrande|5_2023-10-22T15-13-10.027241.parquet"]}, {"split": "2023_10_23T00_26_26.066701", "path": ["**/details_harness|winogrande|5_2023-10-23T00-26-26.066701.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-23T00-26-26.066701.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_31T12_57_51.572522", "path": ["results_2023-07-31T12:57:51.572522.parquet"]}, {"split": "2023_10_22T15_13_10.027241", "path": ["results_2023-10-22T15-13-10.027241.parquet"]}, {"split": "2023_10_23T00_26_26.066701", "path": ["results_2023-10-23T00-26-26.066701.parquet"]}, {"split": "latest", "path": ["results_2023-10-23T00-26-26.066701.parquet"]}]}]}
|
2023-10-22T23:26:34+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of TheBloke/WizardLM-30B-fp16
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model TheBloke/WizardLM-30B-fp16 on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
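The loading snippet was stripped from this copy of the card; below is a minimal sketch, assuming the repository follows the leaderboard's usual `details_<org>__<model>` naming scheme:

```python
from datasets import load_dataset
# Repository name inferred from the leaderboard's naming convention (assumption).
data = load_dataset("open-llm-leaderboard/details_TheBloke__WizardLM-30B-fp16",
	"harness_winogrande_5",
	split="train")
```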
## Latest results
These are the latest results from run 2023-10-23T00:26:26.066701 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
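To inspect these aggregated numbers directly, you can load the "results" configuration, whose "latest" split always points at the most recent run (both names appear in this dataset's metadata above); a sketch under the same repository-name assumption as the snippet above:

```python
from datasets import load_dataset
# "latest" points at the most recent results parquet (here the 2023-10-23T00:26:26.066701 run).
results = load_dataset("open-llm-leaderboard/details_TheBloke__WizardLM-30B-fp16",
	"results",
	split="latest")
```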
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of TheBloke/WizardLM-30B-fp16",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/WizardLM-30B-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-23T00:26:26.066701(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of TheBloke/WizardLM-30B-fp16",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/WizardLM-30B-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-23T00:26:26.066701(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
24,
31,
172,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of TheBloke/WizardLM-30B-fp16## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/WizardLM-30B-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-23T00:26:26.066701(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
b5da211a3da5ec8e151672f85037a8e91037f213
|
# Dataset Card for Evaluation run of TheBloke/airoboros-13B-HF
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/airoboros-13B-HF
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/airoboros-13B-HF](https://huggingface.co/TheBloke/airoboros-13B-HF) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__airoboros-13B-HF",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-23T02:12:37.195873](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__airoboros-13B-HF/blob/main/results_2023-10-23T02-12-37.195873.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.11115771812080537,
"em_stderr": 0.00321900621779522,
"f1": 0.18403838087248262,
"f1_stderr": 0.003410322751505753,
"acc": 0.416848524958218,
"acc_stderr": 0.009523880516878821
},
"harness|drop|3": {
"em": 0.11115771812080537,
"em_stderr": 0.00321900621779522,
"f1": 0.18403838087248262,
"f1_stderr": 0.003410322751505753
},
"harness|gsm8k|5": {
"acc": 0.0712661106899166,
"acc_stderr": 0.007086462127954497
},
"harness|winogrande|5": {
"acc": 0.7624309392265194,
"acc_stderr": 0.011961298905803145
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_TheBloke__airoboros-13B-HF
|
[
"region:us"
] |
2023-08-18T10:26:26+00:00
|
{"pretty_name": "Evaluation run of TheBloke/airoboros-13B-HF", "dataset_summary": "Dataset automatically created during the evaluation run of model [TheBloke/airoboros-13B-HF](https://huggingface.co/TheBloke/airoboros-13B-HF) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__airoboros-13B-HF\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-23T02:12:37.195873](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__airoboros-13B-HF/blob/main/results_2023-10-23T02-12-37.195873.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.11115771812080537,\n \"em_stderr\": 0.00321900621779522,\n \"f1\": 0.18403838087248262,\n \"f1_stderr\": 0.003410322751505753,\n \"acc\": 0.416848524958218,\n \"acc_stderr\": 0.009523880516878821\n },\n \"harness|drop|3\": {\n \"em\": 0.11115771812080537,\n \"em_stderr\": 0.00321900621779522,\n \"f1\": 0.18403838087248262,\n \"f1_stderr\": 0.003410322751505753\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0712661106899166,\n \"acc_stderr\": 0.007086462127954497\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7624309392265194,\n \"acc_stderr\": 0.011961298905803145\n }\n}\n```", "repo_url": "https://huggingface.co/TheBloke/airoboros-13B-HF", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T19_05_45.973556", "path": ["**/details_harness|arc:challenge|25_2023-07-19T19:05:45.973556.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T19:05:45.973556.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_23T02_12_37.195873", "path": ["**/details_harness|drop|3_2023-10-23T02-12-37.195873.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-23T02-12-37.195873.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_23T02_12_37.195873", "path": ["**/details_harness|gsm8k|5_2023-10-23T02-12-37.195873.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-23T02-12-37.195873.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T19_05_45.973556", "path": ["**/details_harness|hellaswag|10_2023-07-19T19:05:45.973556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T19:05:45.973556.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_07_19T19_05_45.973556", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:05:45.973556.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:05:45.973556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:05:45.973556.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T19:05:45.973556.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:05:45.973556.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T19_05_45.973556", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:05:45.973556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:05:45.973556.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T19_05_45.973556", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:05:45.973556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:05:45.973556.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T19_05_45.973556", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:05:45.973556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:05:45.973556.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T19_05_45.973556", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:05:45.973556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:05:45.973556.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T19_05_45.973556", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:05:45.973556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:05:45.973556.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T19_05_45.973556", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:05:45.973556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:05:45.973556.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T19_05_45.973556", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:05:45.973556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:05:45.973556.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T19_05_45.973556", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:05:45.973556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:05:45.973556.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T19_05_45.973556", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:05:45.973556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:05:45.973556.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T19_05_45.973556", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:05:45.973556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:05:45.973556.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T19_05_45.973556", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:05:45.973556.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:05:45.973556.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T19_05_45.973556", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:05:45.973556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:05:45.973556.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T19_05_45.973556", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:05:45.973556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:05:45.973556.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T19_05_45.973556", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:05:45.973556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:05:45.973556.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T19_05_45.973556", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:05:45.973556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:05:45.973556.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T19_05_45.973556", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:05:45.973556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:05:45.973556.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T19_05_45.973556", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:05:45.973556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:05:45.973556.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T19_05_45.973556", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:05:45.973556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:05:45.973556.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T19_05_45.973556", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:05:45.973556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:05:45.973556.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T19_05_45.973556", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:05:45.973556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:05:45.973556.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T19_05_45.973556", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:05:45.973556.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:05:45.973556.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T19_05_45.973556", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:05:45.973556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:05:45.973556.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T19_05_45.973556", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:05:45.973556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:05:45.973556.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T19_05_45.973556", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:05:45.973556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:05:45.973556.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T19_05_45.973556", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:05:45.973556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:05:45.973556.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T19_05_45.973556", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:05:45.973556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:05:45.973556.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T19_05_45.973556", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:05:45.973556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:05:45.973556.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T19_05_45.973556", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:05:45.973556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:05:45.973556.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T19_05_45.973556", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:05:45.973556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:05:45.973556.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T19_05_45.973556", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:05:45.973556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:05:45.973556.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T19_05_45.973556", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:05:45.973556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:05:45.973556.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T19_05_45.973556", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:05:45.973556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:05:45.973556.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T19_05_45.973556", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:05:45.973556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:05:45.973556.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T19_05_45.973556", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:05:45.973556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:05:45.973556.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T19_05_45.973556", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:05:45.973556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:05:45.973556.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T19_05_45.973556", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:05:45.973556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:05:45.973556.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T19_05_45.973556", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:05:45.973556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:05:45.973556.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T19_05_45.973556", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:05:45.973556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:05:45.973556.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T19_05_45.973556", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T19:05:45.973556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T19:05:45.973556.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T19_05_45.973556", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:05:45.973556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:05:45.973556.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T19_05_45.973556", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:05:45.973556.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:05:45.973556.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T19_05_45.973556", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:05:45.973556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:05:45.973556.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T19_05_45.973556", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:05:45.973556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:05:45.973556.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T19_05_45.973556", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:05:45.973556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:05:45.973556.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T19_05_45.973556", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:05:45.973556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:05:45.973556.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T19_05_45.973556", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:05:45.973556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:05:45.973556.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T19_05_45.973556", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:05:45.973556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:05:45.973556.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T19_05_45.973556", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:05:45.973556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:05:45.973556.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T19_05_45.973556", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:05:45.973556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:05:45.973556.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T19_05_45.973556", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:05:45.973556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:05:45.973556.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T19_05_45.973556", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:05:45.973556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:05:45.973556.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T19_05_45.973556", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:05:45.973556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:05:45.973556.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T19_05_45.973556", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:05:45.973556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:05:45.973556.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T19_05_45.973556", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:05:45.973556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:05:45.973556.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T19_05_45.973556", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:05:45.973556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:05:45.973556.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T19_05_45.973556", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T19:05:45.973556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T19:05:45.973556.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T19_05_45.973556", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:05:45.973556.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:05:45.973556.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T19_05_45.973556", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T19:05:45.973556.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T19:05:45.973556.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_23T02_12_37.195873", "path": ["**/details_harness|winogrande|5_2023-10-23T02-12-37.195873.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-23T02-12-37.195873.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T19_05_45.973556", "path": ["results_2023-07-19T19:05:45.973556.parquet"]}, {"split": "2023_10_23T02_12_37.195873", "path": ["results_2023-10-23T02-12-37.195873.parquet"]}, {"split": "latest", "path": ["results_2023-10-23T02-12-37.195873.parquet"]}]}]}
|
2023-10-23T01:12:50+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of TheBloke/airoboros-13B-HF
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model TheBloke/airoboros-13B-HF on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
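The code block was stripped from this copy of the card; the snippet from the full card for this dataset, given earlier in this document, applies here as well:

```python
from datasets import load_dataset
# Same repository and configuration names as in the full card above.
data = load_dataset("open-llm-leaderboard/details_TheBloke__airoboros-13B-HF",
	"harness_winogrande_5",
	split="train")
```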
## Latest results
These are the latest results from run 2023-10-23T02:12:37.195873 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of TheBloke/airoboros-13B-HF",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/airoboros-13B-HF on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-23T02:12:37.195873(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of TheBloke/airoboros-13B-HF",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/airoboros-13B-HF on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-23T02:12:37.195873(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
21,
31,
169,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of TheBloke/airoboros-13B-HF## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/airoboros-13B-HF on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-23T02:12:37.195873(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
8b01a76c170770c645cab608aba074b607941e7c
|
# Dataset Card for Evaluation run of TheBloke/Vicuna-13B-CoT-fp16
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/Vicuna-13B-CoT-fp16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/Vicuna-13B-CoT-fp16](https://huggingface.co/TheBloke/Vicuna-13B-CoT-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__Vicuna-13B-CoT-fp16",
"harness_winogrande_5",
split="train")
```
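The "results" configuration mentioned above can be loaded the same way; a minimal sketch, using the "latest" split alias defined in this card's config metadata:

```python
from datasets import load_dataset

# The "results" config holds the aggregated per-task metrics;
# the "latest" split always resolves to the most recent run.
results = load_dataset("open-llm-leaderboard/details_TheBloke__Vicuna-13B-CoT-fp16",
	"results",
	split="latest")
```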
## Latest results
These are the [latest results from run 2023-10-22T14:12:38.922029](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Vicuna-13B-CoT-fp16/blob/main/results_2023-10-22T14-12-38.922029.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.029677013422818792,
"em_stderr": 0.0017378324714143493,
"f1": 0.09310612416107406,
"f1_stderr": 0.002167792401176146,
"acc": 0.4141695683211732,
"acc_stderr": 0.010019161585538096
},
"harness|drop|3": {
"em": 0.029677013422818792,
"em_stderr": 0.0017378324714143493,
"f1": 0.09310612416107406,
"f1_stderr": 0.002167792401176146
},
"harness|gsm8k|5": {
"acc": 0.08642911296436695,
"acc_stderr": 0.00774004433710381
},
"harness|winogrande|5": {
"acc": 0.7419100236779794,
"acc_stderr": 0.012298278833972384
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_TheBloke__Vicuna-13B-CoT-fp16
|
[
"region:us"
] |
2023-08-18T10:26:35+00:00
|
{"pretty_name": "Evaluation run of TheBloke/Vicuna-13B-CoT-fp16", "dataset_summary": "Dataset automatically created during the evaluation run of model [TheBloke/Vicuna-13B-CoT-fp16](https://huggingface.co/TheBloke/Vicuna-13B-CoT-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__Vicuna-13B-CoT-fp16\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-22T14:12:38.922029](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Vicuna-13B-CoT-fp16/blob/main/results_2023-10-22T14-12-38.922029.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.029677013422818792,\n \"em_stderr\": 0.0017378324714143493,\n \"f1\": 0.09310612416107406,\n \"f1_stderr\": 0.002167792401176146,\n \"acc\": 0.4141695683211732,\n \"acc_stderr\": 0.010019161585538096\n },\n \"harness|drop|3\": {\n \"em\": 0.029677013422818792,\n \"em_stderr\": 0.0017378324714143493,\n \"f1\": 0.09310612416107406,\n \"f1_stderr\": 0.002167792401176146\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.08642911296436695,\n \"acc_stderr\": 0.00774004433710381\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7419100236779794,\n \"acc_stderr\": 0.012298278833972384\n }\n}\n```", "repo_url": "https://huggingface.co/TheBloke/Vicuna-13B-CoT-fp16", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_31T15_25_40.141748", "path": ["**/details_harness|arc:challenge|25_2023-07-31T15:25:40.141748.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-31T15:25:40.141748.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_22T14_12_38.922029", "path": ["**/details_harness|drop|3_2023-10-22T14-12-38.922029.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-22T14-12-38.922029.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_22T14_12_38.922029", "path": ["**/details_harness|gsm8k|5_2023-10-22T14-12-38.922029.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-22T14-12-38.922029.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_31T15_25_40.141748", "path": ["**/details_harness|hellaswag|10_2023-07-31T15:25:40.141748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-31T15:25:40.141748.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", 
"data_files": [{"split": "2023_07_31T15_25_40.141748", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T15:25:40.141748.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-31T15:25:40.141748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T15:25:40.141748.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-31T15:25:40.141748.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-31T15:25:40.141748.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_31T15_25_40.141748", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T15:25:40.141748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T15:25:40.141748.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_31T15_25_40.141748", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-31T15:25:40.141748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-31T15:25:40.141748.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_31T15_25_40.141748", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-31T15:25:40.141748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-31T15:25:40.141748.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_31T15_25_40.141748", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T15:25:40.141748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T15:25:40.141748.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_31T15_25_40.141748", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T15:25:40.141748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T15:25:40.141748.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_31T15_25_40.141748", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-31T15:25:40.141748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-31T15:25:40.141748.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_31T15_25_40.141748", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T15:25:40.141748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T15:25:40.141748.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_31T15_25_40.141748", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T15:25:40.141748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T15:25:40.141748.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_31T15_25_40.141748", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T15:25:40.141748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T15:25:40.141748.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_31T15_25_40.141748", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T15:25:40.141748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T15:25:40.141748.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_31T15_25_40.141748", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-31T15:25:40.141748.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-31T15:25:40.141748.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_31T15_25_40.141748", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-31T15:25:40.141748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-31T15:25:40.141748.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_31T15_25_40.141748", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T15:25:40.141748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T15:25:40.141748.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_31T15_25_40.141748", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-31T15:25:40.141748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-31T15:25:40.141748.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_31T15_25_40.141748", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T15:25:40.141748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T15:25:40.141748.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_31T15_25_40.141748", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T15:25:40.141748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T15:25:40.141748.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_31T15_25_40.141748", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T15:25:40.141748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T15:25:40.141748.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_31T15_25_40.141748", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-31T15:25:40.141748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-31T15:25:40.141748.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_31T15_25_40.141748", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T15:25:40.141748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T15:25:40.141748.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_31T15_25_40.141748", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T15:25:40.141748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T15:25:40.141748.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_31T15_25_40.141748", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T15:25:40.141748.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T15:25:40.141748.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_31T15_25_40.141748", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T15:25:40.141748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T15:25:40.141748.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_31T15_25_40.141748", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T15:25:40.141748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T15:25:40.141748.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_31T15_25_40.141748", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T15:25:40.141748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T15:25:40.141748.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_31T15_25_40.141748", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T15:25:40.141748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T15:25:40.141748.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_31T15_25_40.141748", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T15:25:40.141748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T15:25:40.141748.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_31T15_25_40.141748", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T15:25:40.141748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T15:25:40.141748.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_31T15_25_40.141748", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T15:25:40.141748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T15:25:40.141748.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_31T15_25_40.141748", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T15:25:40.141748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T15:25:40.141748.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_31T15_25_40.141748", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T15:25:40.141748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T15:25:40.141748.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_31T15_25_40.141748", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T15:25:40.141748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T15:25:40.141748.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_31T15_25_40.141748", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T15:25:40.141748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T15:25:40.141748.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_31T15_25_40.141748", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-31T15:25:40.141748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-31T15:25:40.141748.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_31T15_25_40.141748", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T15:25:40.141748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T15:25:40.141748.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_31T15_25_40.141748", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-31T15:25:40.141748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-31T15:25:40.141748.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_31T15_25_40.141748", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T15:25:40.141748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T15:25:40.141748.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_31T15_25_40.141748", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T15:25:40.141748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T15:25:40.141748.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_31T15_25_40.141748", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T15:25:40.141748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T15:25:40.141748.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_31T15_25_40.141748", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-31T15:25:40.141748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-31T15:25:40.141748.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_31T15_25_40.141748", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-31T15:25:40.141748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-31T15:25:40.141748.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_31T15_25_40.141748", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T15:25:40.141748.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T15:25:40.141748.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_31T15_25_40.141748", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T15:25:40.141748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T15:25:40.141748.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_31T15_25_40.141748", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T15:25:40.141748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T15:25:40.141748.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_31T15_25_40.141748", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T15:25:40.141748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T15:25:40.141748.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_31T15_25_40.141748", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-31T15:25:40.141748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-31T15:25:40.141748.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_31T15_25_40.141748", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-31T15:25:40.141748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-31T15:25:40.141748.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_31T15_25_40.141748", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-31T15:25:40.141748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-31T15:25:40.141748.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_31T15_25_40.141748", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T15:25:40.141748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T15:25:40.141748.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_31T15_25_40.141748", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-31T15:25:40.141748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-31T15:25:40.141748.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_31T15_25_40.141748", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T15:25:40.141748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T15:25:40.141748.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_31T15_25_40.141748", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T15:25:40.141748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T15:25:40.141748.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_31T15_25_40.141748", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-31T15:25:40.141748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-31T15:25:40.141748.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_31T15_25_40.141748", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-31T15:25:40.141748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-31T15:25:40.141748.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_31T15_25_40.141748", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-31T15:25:40.141748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-31T15:25:40.141748.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_31T15_25_40.141748", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T15:25:40.141748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T15:25:40.141748.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_31T15_25_40.141748", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-31T15:25:40.141748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-31T15:25:40.141748.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_31T15_25_40.141748", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-31T15:25:40.141748.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-31T15:25:40.141748.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_31T15_25_40.141748", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-31T15:25:40.141748.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-31T15:25:40.141748.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_22T14_12_38.922029", "path": ["**/details_harness|winogrande|5_2023-10-22T14-12-38.922029.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-22T14-12-38.922029.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_31T15_25_40.141748", "path": ["results_2023-07-31T15:25:40.141748.parquet"]}, {"split": "2023_10_22T14_12_38.922029", "path": ["results_2023-10-22T14-12-38.922029.parquet"]}, {"split": "latest", "path": ["results_2023-10-22T14-12-38.922029.parquet"]}]}]}
|
2023-10-22T13:12:51+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of TheBloke/Vicuna-13B-CoT-fp16
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model TheBloke/Vicuna-13B-CoT-fp16 on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-10-22T14:12:38.922029 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of TheBloke/Vicuna-13B-CoT-fp16",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/Vicuna-13B-CoT-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-22T14:12:38.922029(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of TheBloke/Vicuna-13B-CoT-fp16",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/Vicuna-13B-CoT-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-22T14:12:38.922029(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
26,
31,
174,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of TheBloke/Vicuna-13B-CoT-fp16## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/Vicuna-13B-CoT-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-22T14:12:38.922029(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
c1a30df26198f8b3f551dc442f418d2374763176
|
# Dataset Card for Evaluation run of TheBloke/Llama-2-70B-fp16
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/Llama-2-70B-fp16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/Llama-2-70B-fp16](https://huggingface.co/TheBloke/Llama-2-70B-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__Llama-2-70B-fp16",
"harness_winogrande_5",
split="train")
```
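Individual runs can also be addressed by their timestamped split rather than "train"; a minimal sketch, using the run timestamp and config name listed in this card's config metadata:

```python
from datasets import load_dataset

# Splits are named after the run timestamp; "latest" aliases the most recent one.
run = load_dataset("open-llm-leaderboard/details_TheBloke__Llama-2-70B-fp16",
	"harness_gsm8k_5",
	split="2023_10_23T03_18_37.286787")
```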
## Latest results
These are the [latest results from run 2023-10-23T03:18:37.286787](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Llama-2-70B-fp16/blob/main/results_2023-10-23T03-18-37.286787.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0017827181208053692,
"em_stderr": 0.00043200973460388544,
"f1": 0.06615562080536916,
"f1_stderr": 0.0013739852117668813,
"acc": 0.5885312292623206,
"acc_stderr": 0.011707750309504293
},
"harness|drop|3": {
"em": 0.0017827181208053692,
"em_stderr": 0.00043200973460388544,
"f1": 0.06615562080536916,
"f1_stderr": 0.0013739852117668813
},
"harness|gsm8k|5": {
"acc": 0.33965125094768767,
"acc_stderr": 0.01304504506766526
},
"harness|winogrande|5": {
"acc": 0.8374112075769534,
"acc_stderr": 0.010370455551343326
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_TheBloke__Llama-2-70B-fp16
|
[
"region:us"
] |
2023-08-18T10:26:43+00:00
|
{"pretty_name": "Evaluation run of TheBloke/Llama-2-70B-fp16", "dataset_summary": "Dataset automatically created during the evaluation run of model [TheBloke/Llama-2-70B-fp16](https://huggingface.co/TheBloke/Llama-2-70B-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__Llama-2-70B-fp16\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-23T03:18:37.286787](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Llama-2-70B-fp16/blob/main/results_2023-10-23T03-18-37.286787.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0017827181208053692,\n \"em_stderr\": 0.00043200973460388544,\n \"f1\": 0.06615562080536916,\n \"f1_stderr\": 0.0013739852117668813,\n \"acc\": 0.5885312292623206,\n \"acc_stderr\": 0.011707750309504293\n },\n \"harness|drop|3\": {\n \"em\": 0.0017827181208053692,\n \"em_stderr\": 0.00043200973460388544,\n \"f1\": 0.06615562080536916,\n \"f1_stderr\": 0.0013739852117668813\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.33965125094768767,\n \"acc_stderr\": 0.01304504506766526\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8374112075769534,\n \"acc_stderr\": 0.010370455551343326\n }\n}\n```", "repo_url": "https://huggingface.co/TheBloke/Llama-2-70B-fp16", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_31T16_40_00.231770", "path": ["**/details_harness|arc:challenge|25_2023-07-31T16:40:00.231770.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-31T16:40:00.231770.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_23T03_18_37.286787", "path": ["**/details_harness|drop|3_2023-10-23T03-18-37.286787.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-23T03-18-37.286787.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_23T03_18_37.286787", "path": ["**/details_harness|gsm8k|5_2023-10-23T03-18-37.286787.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-23T03-18-37.286787.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_31T16_40_00.231770", "path": ["**/details_harness|hellaswag|10_2023-07-31T16:40:00.231770.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-31T16:40:00.231770.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": 
[{"split": "2023_07_31T16_40_00.231770", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T16:40:00.231770.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-31T16:40:00.231770.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T16:40:00.231770.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-31T16:40:00.231770.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-31T16:40:00.231770.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_31T16_40_00.231770", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T16:40:00.231770.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T16:40:00.231770.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_31T16_40_00.231770", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-31T16:40:00.231770.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-31T16:40:00.231770.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_31T16_40_00.231770", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-31T16:40:00.231770.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-31T16:40:00.231770.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_31T16_40_00.231770", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T16:40:00.231770.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T16:40:00.231770.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_31T16_40_00.231770", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T16:40:00.231770.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T16:40:00.231770.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_31T16_40_00.231770", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-31T16:40:00.231770.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-31T16:40:00.231770.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_31T16_40_00.231770", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T16:40:00.231770.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T16:40:00.231770.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_31T16_40_00.231770", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T16:40:00.231770.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T16:40:00.231770.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_31T16_40_00.231770", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T16:40:00.231770.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T16:40:00.231770.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_31T16_40_00.231770", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T16:40:00.231770.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T16:40:00.231770.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_31T16_40_00.231770", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-31T16:40:00.231770.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-31T16:40:00.231770.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_31T16_40_00.231770", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-31T16:40:00.231770.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-31T16:40:00.231770.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_31T16_40_00.231770", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T16:40:00.231770.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T16:40:00.231770.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_31T16_40_00.231770", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-31T16:40:00.231770.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-31T16:40:00.231770.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_31T16_40_00.231770", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T16:40:00.231770.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T16:40:00.231770.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_31T16_40_00.231770", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T16:40:00.231770.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T16:40:00.231770.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_31T16_40_00.231770", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T16:40:00.231770.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T16:40:00.231770.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_31T16_40_00.231770", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-31T16:40:00.231770.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-31T16:40:00.231770.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_31T16_40_00.231770", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T16:40:00.231770.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T16:40:00.231770.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_31T16_40_00.231770", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T16:40:00.231770.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T16:40:00.231770.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_31T16_40_00.231770", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T16:40:00.231770.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T16:40:00.231770.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_31T16_40_00.231770", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T16:40:00.231770.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T16:40:00.231770.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_31T16_40_00.231770", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T16:40:00.231770.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T16:40:00.231770.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_31T16_40_00.231770", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T16:40:00.231770.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T16:40:00.231770.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_31T16_40_00.231770", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T16:40:00.231770.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T16:40:00.231770.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_31T16_40_00.231770", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T16:40:00.231770.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T16:40:00.231770.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_31T16_40_00.231770", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T16:40:00.231770.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T16:40:00.231770.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_31T16_40_00.231770", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T16:40:00.231770.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T16:40:00.231770.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_31T16_40_00.231770", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T16:40:00.231770.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T16:40:00.231770.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_31T16_40_00.231770", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T16:40:00.231770.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T16:40:00.231770.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_31T16_40_00.231770", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T16:40:00.231770.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T16:40:00.231770.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_31T16_40_00.231770", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T16:40:00.231770.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T16:40:00.231770.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_31T16_40_00.231770", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-31T16:40:00.231770.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-31T16:40:00.231770.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_31T16_40_00.231770", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T16:40:00.231770.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T16:40:00.231770.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_31T16_40_00.231770", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-31T16:40:00.231770.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-31T16:40:00.231770.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_31T16_40_00.231770", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T16:40:00.231770.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T16:40:00.231770.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_31T16_40_00.231770", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T16:40:00.231770.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T16:40:00.231770.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_31T16_40_00.231770", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T16:40:00.231770.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T16:40:00.231770.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_31T16_40_00.231770", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-31T16:40:00.231770.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-31T16:40:00.231770.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_31T16_40_00.231770", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-31T16:40:00.231770.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-31T16:40:00.231770.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_31T16_40_00.231770", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T16:40:00.231770.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T16:40:00.231770.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_31T16_40_00.231770", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T16:40:00.231770.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T16:40:00.231770.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_31T16_40_00.231770", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T16:40:00.231770.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T16:40:00.231770.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_31T16_40_00.231770", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T16:40:00.231770.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T16:40:00.231770.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_31T16_40_00.231770", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-31T16:40:00.231770.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-31T16:40:00.231770.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_31T16_40_00.231770", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-31T16:40:00.231770.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-31T16:40:00.231770.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_31T16_40_00.231770", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-31T16:40:00.231770.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-31T16:40:00.231770.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_31T16_40_00.231770", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T16:40:00.231770.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T16:40:00.231770.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_31T16_40_00.231770", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-31T16:40:00.231770.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-31T16:40:00.231770.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_31T16_40_00.231770", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T16:40:00.231770.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T16:40:00.231770.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_31T16_40_00.231770", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T16:40:00.231770.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T16:40:00.231770.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_31T16_40_00.231770", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-31T16:40:00.231770.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-31T16:40:00.231770.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_31T16_40_00.231770", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-31T16:40:00.231770.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-31T16:40:00.231770.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_31T16_40_00.231770", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-31T16:40:00.231770.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-31T16:40:00.231770.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_31T16_40_00.231770", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T16:40:00.231770.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T16:40:00.231770.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_31T16_40_00.231770", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-31T16:40:00.231770.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-31T16:40:00.231770.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_31T16_40_00.231770", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-31T16:40:00.231770.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-31T16:40:00.231770.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_31T16_40_00.231770", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-31T16:40:00.231770.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-31T16:40:00.231770.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_23T03_18_37.286787", "path": ["**/details_harness|winogrande|5_2023-10-23T03-18-37.286787.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-23T03-18-37.286787.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_31T16_40_00.231770", "path": ["results_2023-07-31T16:40:00.231770.parquet"]}, {"split": "2023_10_23T03_18_37.286787", "path": ["results_2023-10-23T03-18-37.286787.parquet"]}, {"split": "latest", "path": ["results_2023-10-23T03-18-37.286787.parquet"]}]}]}
|
2023-10-23T02:18:51+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of TheBloke/Llama-2-70B-fp16
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model TheBloke/Llama-2-70B-fp16 on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
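The loading snippet, as recorded in this card's metadata, is:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__Llama-2-70B-fp16",
	"harness_winogrande_5",
	split="train")
```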
## Latest results
These are the latest results from run 2023-10-23T03:18:37.286787 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
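```python
# Aggregated scores for run 2023-10-23T03:18:37.286787 (from the card metadata)
{
    "all": {
        "em": 0.0017827181208053692,
        "em_stderr": 0.00043200973460388544,
        "f1": 0.06615562080536916,
        "f1_stderr": 0.0013739852117668813,
        "acc": 0.5885312292623206,
        "acc_stderr": 0.011707750309504293
    },
    "harness|drop|3": {
        "em": 0.0017827181208053692,
        "em_stderr": 0.00043200973460388544,
        "f1": 0.06615562080536916,
        "f1_stderr": 0.0013739852117668813
    },
    "harness|gsm8k|5": {
        "acc": 0.33965125094768767,
        "acc_stderr": 0.01304504506766526
    },
    "harness|winogrande|5": {
        "acc": 0.8374112075769534,
        "acc_stderr": 0.010370455551343326
    }
}
```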
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of TheBloke/Llama-2-70B-fp16",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/Llama-2-70B-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-23T03:18:37.286787(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of TheBloke/Llama-2-70B-fp16",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/Llama-2-70B-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-23T03:18:37.286787(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
23,
31,
171,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of TheBloke/Llama-2-70B-fp16## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/Llama-2-70B-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-23T03:18:37.286787(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
62a65f697e613b952cba753bb86248ae76c38bb2
|
# Dataset Card for Evaluation run of TheBloke/Llama-2-13B-fp16
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/Llama-2-13B-fp16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/Llama-2-13B-fp16](https://huggingface.co/TheBloke/Llama-2-13B-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__Llama-2-13B-fp16",
"harness_winogrande_5",
split="train")
```
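The `latest` split of any configuration always mirrors the most recent run. A minimal sketch for pulling the aggregated scores rather than the per-task details (assuming the standard `datasets` client and the `results` configuration listed in this card's metadata):
```python
from datasets import load_dataset

# "results" aggregates all task metrics; the "latest" split tracks the most recent run.
results = load_dataset("open-llm-leaderboard/details_TheBloke__Llama-2-13B-fp16",
	"results",
	split="latest")
print(results.column_names)  # inspect the aggregated metrics recorded for this run
```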
## Latest results
These are the [latest results from run 2023-10-22T22:53:07.629534](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Llama-2-13B-fp16/blob/main/results_2023-10-22T22-53-07.629534.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0014681208053691276,
"em_stderr": 0.00039210421902982666,
"f1": 0.0607822986577181,
"f1_stderr": 0.0013583957676382913,
"acc": 0.43739636770101,
"acc_stderr": 0.010228023491905505
},
"harness|drop|3": {
"em": 0.0014681208053691276,
"em_stderr": 0.00039210421902982666,
"f1": 0.0607822986577181,
"f1_stderr": 0.0013583957676382913
},
"harness|gsm8k|5": {
"acc": 0.10841546626231995,
"acc_stderr": 0.008563852506627487
},
"harness|winogrande|5": {
"acc": 0.7663772691397001,
"acc_stderr": 0.011892194477183524
}
}
```
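Since the details are spread across 64 configurations (one per evaluated task, plus "results"), enumerating them programmatically can help; a small sketch assuming the `datasets` client:
```python
from datasets import get_dataset_config_names

# List every configuration of this details dataset (one per harness task).
configs = get_dataset_config_names("open-llm-leaderboard/details_TheBloke__Llama-2-13B-fp16")
print(len(configs))
print(configs[:5])
```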
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_TheBloke__Llama-2-13B-fp16
|
[
"region:us"
] |
2023-08-18T10:26:52+00:00
|
{"pretty_name": "Evaluation run of TheBloke/Llama-2-13B-fp16", "dataset_summary": "Dataset automatically created during the evaluation run of model [TheBloke/Llama-2-13B-fp16](https://huggingface.co/TheBloke/Llama-2-13B-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__Llama-2-13B-fp16\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-22T22:53:07.629534](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Llama-2-13B-fp16/blob/main/results_2023-10-22T22-53-07.629534.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0014681208053691276,\n \"em_stderr\": 0.00039210421902982666,\n \"f1\": 0.0607822986577181,\n \"f1_stderr\": 0.0013583957676382913,\n \"acc\": 0.43739636770101,\n \"acc_stderr\": 0.010228023491905505\n },\n \"harness|drop|3\": {\n \"em\": 0.0014681208053691276,\n \"em_stderr\": 0.00039210421902982666,\n \"f1\": 0.0607822986577181,\n \"f1_stderr\": 0.0013583957676382913\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.10841546626231995,\n \"acc_stderr\": 0.008563852506627487\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7663772691397001,\n \"acc_stderr\": 0.011892194477183524\n }\n}\n```", "repo_url": "https://huggingface.co/TheBloke/Llama-2-13B-fp16", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_24T15_08_39.202746", "path": ["**/details_harness|arc:challenge|25_2023-07-24T15:08:39.202746.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-24T15:08:39.202746.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_22T22_53_07.629534", "path": ["**/details_harness|drop|3_2023-10-22T22-53-07.629534.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-22T22-53-07.629534.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_22T22_53_07.629534", "path": ["**/details_harness|gsm8k|5_2023-10-22T22-53-07.629534.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-22T22-53-07.629534.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_24T15_08_39.202746", "path": ["**/details_harness|hellaswag|10_2023-07-24T15:08:39.202746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-24T15:08:39.202746.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": 
[{"split": "2023_07_24T15_08_39.202746", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T15:08:39.202746.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-24T15:08:39.202746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T15:08:39.202746.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T15:08:39.202746.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-24T15:08:39.202746.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_24T15_08_39.202746", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T15:08:39.202746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T15:08:39.202746.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_24T15_08_39.202746", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T15:08:39.202746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T15:08:39.202746.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_24T15_08_39.202746", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T15:08:39.202746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T15:08:39.202746.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_24T15_08_39.202746", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T15:08:39.202746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T15:08:39.202746.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_24T15_08_39.202746", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T15:08:39.202746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T15:08:39.202746.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_24T15_08_39.202746", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T15:08:39.202746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T15:08:39.202746.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_24T15_08_39.202746", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T15:08:39.202746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T15:08:39.202746.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_24T15_08_39.202746", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T15:08:39.202746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T15:08:39.202746.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_24T15_08_39.202746", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T15:08:39.202746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T15:08:39.202746.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_24T15_08_39.202746", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T15:08:39.202746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T15:08:39.202746.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_24T15_08_39.202746", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T15:08:39.202746.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T15:08:39.202746.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_24T15_08_39.202746", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T15:08:39.202746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T15:08:39.202746.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_24T15_08_39.202746", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T15:08:39.202746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T15:08:39.202746.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_24T15_08_39.202746", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T15:08:39.202746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T15:08:39.202746.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_24T15_08_39.202746", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T15:08:39.202746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T15:08:39.202746.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_24T15_08_39.202746", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T15:08:39.202746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T15:08:39.202746.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_24T15_08_39.202746", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T15:08:39.202746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T15:08:39.202746.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_24T15_08_39.202746", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T15:08:39.202746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T15:08:39.202746.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_24T15_08_39.202746", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T15:08:39.202746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T15:08:39.202746.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_24T15_08_39.202746", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T15:08:39.202746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T15:08:39.202746.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_24T15_08_39.202746", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T15:08:39.202746.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T15:08:39.202746.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_24T15_08_39.202746", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T15:08:39.202746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T15:08:39.202746.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_24T15_08_39.202746", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T15:08:39.202746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T15:08:39.202746.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_24T15_08_39.202746", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T15:08:39.202746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T15:08:39.202746.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_24T15_08_39.202746", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T15:08:39.202746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T15:08:39.202746.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_24T15_08_39.202746", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T15:08:39.202746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T15:08:39.202746.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_24T15_08_39.202746", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T15:08:39.202746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T15:08:39.202746.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_24T15_08_39.202746", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T15:08:39.202746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T15:08:39.202746.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_24T15_08_39.202746", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T15:08:39.202746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T15:08:39.202746.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_24T15_08_39.202746", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T15:08:39.202746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T15:08:39.202746.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_24T15_08_39.202746", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T15:08:39.202746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T15:08:39.202746.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_24T15_08_39.202746", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T15:08:39.202746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T15:08:39.202746.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_24T15_08_39.202746", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T15:08:39.202746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T15:08:39.202746.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_24T15_08_39.202746", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T15:08:39.202746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T15:08:39.202746.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_24T15_08_39.202746", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T15:08:39.202746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T15:08:39.202746.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_24T15_08_39.202746", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T15:08:39.202746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T15:08:39.202746.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_24T15_08_39.202746", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T15:08:39.202746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T15:08:39.202746.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_24T15_08_39.202746", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T15:08:39.202746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T15:08:39.202746.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_24T15_08_39.202746", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T15:08:39.202746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T15:08:39.202746.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_24T15_08_39.202746", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T15:08:39.202746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T15:08:39.202746.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_24T15_08_39.202746", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T15:08:39.202746.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T15:08:39.202746.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_24T15_08_39.202746", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T15:08:39.202746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T15:08:39.202746.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_24T15_08_39.202746", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T15:08:39.202746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T15:08:39.202746.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_24T15_08_39.202746", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T15:08:39.202746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T15:08:39.202746.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_24T15_08_39.202746", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T15:08:39.202746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T15:08:39.202746.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_24T15_08_39.202746", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T15:08:39.202746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T15:08:39.202746.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_24T15_08_39.202746", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T15:08:39.202746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T15:08:39.202746.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_24T15_08_39.202746", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T15:08:39.202746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T15:08:39.202746.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_24T15_08_39.202746", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T15:08:39.202746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T15:08:39.202746.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_24T15_08_39.202746", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T15:08:39.202746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T15:08:39.202746.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_24T15_08_39.202746", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T15:08:39.202746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T15:08:39.202746.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_24T15_08_39.202746", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T15:08:39.202746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T15:08:39.202746.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_24T15_08_39.202746", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T15:08:39.202746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T15:08:39.202746.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_24T15_08_39.202746", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T15:08:39.202746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T15:08:39.202746.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_24T15_08_39.202746", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T15:08:39.202746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T15:08:39.202746.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_24T15_08_39.202746", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T15:08:39.202746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T15:08:39.202746.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_24T15_08_39.202746", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T15:08:39.202746.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T15:08:39.202746.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_24T15_08_39.202746", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T15:08:39.202746.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T15:08:39.202746.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_22T22_53_07.629534", "path": ["**/details_harness|winogrande|5_2023-10-22T22-53-07.629534.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-22T22-53-07.629534.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_24T15_08_39.202746", "path": ["results_2023-07-24T15:08:39.202746.parquet"]}, {"split": "2023_10_22T22_53_07.629534", "path": ["results_2023-10-22T22-53-07.629534.parquet"]}, {"split": "latest", "path": ["results_2023-10-22T22-53-07.629534.parquet"]}]}]}
|
2023-10-22T21:53:19+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of TheBloke/Llama-2-13B-fp16
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model TheBloke/Llama-2-13B-fp16 on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
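A minimal sketch, assuming this dataset follows the same `open-llm-leaderboard/details_<org>__<model>` naming convention as the other evaluation runs in this document (the exact repository id is an assumption; the "harness_winogrande_5" config is listed in this card's metadata):
```python
from datasets import load_dataset

# Dataset id is assumed from the naming convention used by the other
# evaluation-run datasets; the config name comes from the metadata above.
data = load_dataset("open-llm-leaderboard/details_TheBloke__Llama-2-13B-fp16",
	"harness_winogrande_5",
	split="train")
```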
## Latest results
These are the latest results from run 2023-10-22T22:53:07.629534 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval).
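To inspect this specific run rather than the latest one, the timestamped split listed in the metadata above can be loaded directly (dataset id assumed as before):
```python
from datasets import load_dataset

# The split name is the run timestamp with "-" and ":" replaced by "_",
# exactly as it appears in the "configs" metadata for this dataset.
run_details = load_dataset("open-llm-leaderboard/details_TheBloke__Llama-2-13B-fp16",
	"harness_winogrande_5",
	split="2023_10_22T22_53_07.629534")
```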
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
e17b2977679d6ab3f799929964d5f111d72afb0c
|
# Dataset Card for Evaluation run of TheBloke/koala-7B-HF
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/koala-7B-HF
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/koala-7B-HF](https://huggingface.co/TheBloke/koala-7B-HF) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__koala-7B-HF",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-22T01:40:19.739323](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__koala-7B-HF/blob/main/results_2023-10-22T01-40-19.739323.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.15855704697986578,
"em_stderr": 0.003740630102537935,
"f1": 0.21851510067114052,
"f1_stderr": 0.0038089998736125477,
"acc": 0.36784043303715414,
"acc_stderr": 0.009023061991967956
},
"harness|drop|3": {
"em": 0.15855704697986578,
"em_stderr": 0.003740630102537935,
"f1": 0.21851510067114052,
"f1_stderr": 0.0038089998736125477
},
"harness|gsm8k|5": {
"acc": 0.03639120545868082,
"acc_stderr": 0.005158113489231195
},
"harness|winogrande|5": {
"acc": 0.6992896606156275,
"acc_stderr": 0.012888010494704718
}
}
```
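The aggregated numbers above live in the "results" configuration; a minimal sketch for loading the most recent aggregate (both the config name and its "latest" split are listed in this dataset's metadata):
```python
from datasets import load_dataset

# "results" aggregates every run; "latest" points at the most recent
# one (2023-10-22T01:40:19.739323 at the time of writing).
results = load_dataset("open-llm-leaderboard/details_TheBloke__koala-7B-HF",
	"results",
	split="latest")
```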
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_TheBloke__koala-7B-HF
|
[
"region:us"
] |
2023-08-18T10:27:01+00:00
|
{"pretty_name": "Evaluation run of TheBloke/koala-7B-HF", "dataset_summary": "Dataset automatically created during the evaluation run of model [TheBloke/koala-7B-HF](https://huggingface.co/TheBloke/koala-7B-HF) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__koala-7B-HF\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-22T01:40:19.739323](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__koala-7B-HF/blob/main/results_2023-10-22T01-40-19.739323.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.15855704697986578,\n \"em_stderr\": 0.003740630102537935,\n \"f1\": 0.21851510067114052,\n \"f1_stderr\": 0.0038089998736125477,\n \"acc\": 0.36784043303715414,\n \"acc_stderr\": 0.009023061991967956\n },\n \"harness|drop|3\": {\n \"em\": 0.15855704697986578,\n \"em_stderr\": 0.003740630102537935,\n \"f1\": 0.21851510067114052,\n \"f1_stderr\": 0.0038089998736125477\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.03639120545868082,\n \"acc_stderr\": 0.005158113489231195\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6992896606156275,\n \"acc_stderr\": 0.012888010494704718\n }\n}\n```", "repo_url": "https://huggingface.co/TheBloke/koala-7B-HF", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T17_17_07.046452", "path": ["**/details_harness|arc:challenge|25_2023-07-19T17:17:07.046452.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T17:17:07.046452.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_22T01_40_19.739323", "path": ["**/details_harness|drop|3_2023-10-22T01-40-19.739323.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-22T01-40-19.739323.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_22T01_40_19.739323", "path": ["**/details_harness|gsm8k|5_2023-10-22T01-40-19.739323.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-22T01-40-19.739323.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T17_17_07.046452", "path": ["**/details_harness|hellaswag|10_2023-07-19T17:17:07.046452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T17:17:07.046452.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_07_19T17_17_07.046452", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:17:07.046452.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:17:07.046452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:17:07.046452.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T17:17:07.046452.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:17:07.046452.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T17_17_07.046452", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:17:07.046452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:17:07.046452.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T17_17_07.046452", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:17:07.046452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:17:07.046452.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T17_17_07.046452", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:17:07.046452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:17:07.046452.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T17_17_07.046452", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:17:07.046452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:17:07.046452.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T17_17_07.046452", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:17:07.046452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:17:07.046452.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T17_17_07.046452", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:17:07.046452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:17:07.046452.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T17_17_07.046452", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:17:07.046452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:17:07.046452.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T17_17_07.046452", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:17:07.046452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:17:07.046452.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T17_17_07.046452", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:17:07.046452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:17:07.046452.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T17_17_07.046452", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:17:07.046452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:17:07.046452.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T17_17_07.046452", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:17:07.046452.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:17:07.046452.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T17_17_07.046452", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:17:07.046452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:17:07.046452.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T17_17_07.046452", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:17:07.046452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:17:07.046452.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T17_17_07.046452", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:17:07.046452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:17:07.046452.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T17_17_07.046452", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:17:07.046452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:17:07.046452.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T17_17_07.046452", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:17:07.046452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:17:07.046452.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T17_17_07.046452", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:17:07.046452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:17:07.046452.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T17_17_07.046452", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:17:07.046452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:17:07.046452.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T17_17_07.046452", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:17:07.046452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:17:07.046452.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T17_17_07.046452", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:17:07.046452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:17:07.046452.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T17_17_07.046452", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:17:07.046452.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:17:07.046452.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T17_17_07.046452", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:17:07.046452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:17:07.046452.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T17_17_07.046452", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:17:07.046452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:17:07.046452.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T17_17_07.046452", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:17:07.046452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:17:07.046452.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T17_17_07.046452", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:17:07.046452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:17:07.046452.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T17_17_07.046452", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:17:07.046452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:17:07.046452.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T17_17_07.046452", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:17:07.046452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:17:07.046452.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T17_17_07.046452", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:17:07.046452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:17:07.046452.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T17_17_07.046452", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:17:07.046452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:17:07.046452.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T17_17_07.046452", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:17:07.046452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:17:07.046452.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T17_17_07.046452", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:17:07.046452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:17:07.046452.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T17_17_07.046452", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:17:07.046452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:17:07.046452.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T17_17_07.046452", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:17:07.046452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:17:07.046452.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T17_17_07.046452", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:17:07.046452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:17:07.046452.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T17_17_07.046452", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:17:07.046452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:17:07.046452.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T17_17_07.046452", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:17:07.046452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:17:07.046452.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T17_17_07.046452", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:17:07.046452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:17:07.046452.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T17_17_07.046452", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:17:07.046452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:17:07.046452.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T17_17_07.046452", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T17:17:07.046452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T17:17:07.046452.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T17_17_07.046452", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:17:07.046452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:17:07.046452.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T17_17_07.046452", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:17:07.046452.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:17:07.046452.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T17_17_07.046452", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:17:07.046452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:17:07.046452.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T17_17_07.046452", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:17:07.046452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:17:07.046452.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T17_17_07.046452", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:17:07.046452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:17:07.046452.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T17_17_07.046452", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:17:07.046452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:17:07.046452.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T17_17_07.046452", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:17:07.046452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:17:07.046452.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T17_17_07.046452", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:17:07.046452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:17:07.046452.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T17_17_07.046452", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:17:07.046452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:17:07.046452.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T17_17_07.046452", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:17:07.046452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:17:07.046452.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T17_17_07.046452", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:17:07.046452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:17:07.046452.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T17_17_07.046452", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:17:07.046452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:17:07.046452.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T17_17_07.046452", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:17:07.046452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:17:07.046452.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T17_17_07.046452", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:17:07.046452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:17:07.046452.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T17_17_07.046452", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:17:07.046452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:17:07.046452.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T17_17_07.046452", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:17:07.046452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:17:07.046452.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T17_17_07.046452", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T17:17:07.046452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T17:17:07.046452.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T17_17_07.046452", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:17:07.046452.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:17:07.046452.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T17_17_07.046452", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T17:17:07.046452.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T17:17:07.046452.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_22T01_40_19.739323", "path": ["**/details_harness|winogrande|5_2023-10-22T01-40-19.739323.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-22T01-40-19.739323.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T17_17_07.046452", "path": ["results_2023-07-19T17:17:07.046452.parquet"]}, {"split": "2023_10_22T01_40_19.739323", "path": ["results_2023-10-22T01-40-19.739323.parquet"]}, {"split": "latest", "path": ["results_2023-10-22T01-40-19.739323.parquet"]}]}]}
|
2023-10-22T00:40:32+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of TheBloke/koala-7B-HF
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model TheBloke/koala-7B-HF on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
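A minimal sketch of that call, assuming the repository follows the `details_<org>__<model>` naming convention used by the other cards in this collection (the exact repo id is an inference, not stated in this card):

```python
from datasets import load_dataset

# Repo id inferred from the leaderboard's naming convention; verify it on the Hub.
data = load_dataset("open-llm-leaderboard/details_TheBloke__koala-7B-HF",
	"harness_winogrande_5",
	split="train")
```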
## Latest results
These are the latest results from run 2023-10-22T01:40:19.739323 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of TheBloke/koala-7B-HF",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/koala-7B-HF on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-22T01:40:19.739323(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of TheBloke/koala-7B-HF",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/koala-7B-HF on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-22T01:40:19.739323(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
20,
31,
168,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of TheBloke/koala-7B-HF## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/koala-7B-HF on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-22T01:40:19.739323(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
c6e3ac8f481c6fed21726afb3792789cdcdf6294
|
# Dataset Card for Evaluation run of TheBloke/VicUnlocked-alpaca-65B-QLoRA-fp16
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/VicUnlocked-alpaca-65B-QLoRA-fp16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/VicUnlocked-alpaca-65B-QLoRA-fp16](https://huggingface.co/TheBloke/VicUnlocked-alpaca-65B-QLoRA-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__VicUnlocked-alpaca-65B-QLoRA-fp16",
"harness_winogrande_5",
split="train")
```
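Besides `"train"`/`"latest"`, each timestamped run is exposed as its own split; a sketch using a split name taken from this card's config metadata (hyphens and colons in the timestamp become underscores):

```python
from datasets import load_dataset

# Split name copied from the "configs" metadata of this card.
run = load_dataset("open-llm-leaderboard/details_TheBloke__VicUnlocked-alpaca-65B-QLoRA-fp16",
	"harness_winogrande_5",
	split="2023_10_23T05_26_06.926177")
```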
## Latest results
These are the [latest results from run 2023-10-23T05:26:06.926177](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__VicUnlocked-alpaca-65B-QLoRA-fp16/blob/main/results_2023-10-23T05-26-06.926177.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.002202181208053691,
"em_stderr": 0.0004800510816619414,
"f1": 0.07708053691275155,
"f1_stderr": 0.0014808362697713243,
"acc": 0.5455925269256983,
"acc_stderr": 0.011651760053332794
},
"harness|drop|3": {
"em": 0.002202181208053691,
"em_stderr": 0.0004800510816619414,
"f1": 0.07708053691275155,
"f1_stderr": 0.0014808362697713243
},
"harness|gsm8k|5": {
"acc": 0.27824109173616374,
"acc_stderr": 0.012343803671422678
},
"harness|winogrande|5": {
"acc": 0.8129439621152328,
"acc_stderr": 0.01095971643524291
}
}
```
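To fetch those aggregated numbers programmatically instead of copying them from this card, you can load the "results" configuration listed in the metadata; a minimal sketch:

```python
from datasets import load_dataset

results = load_dataset("open-llm-leaderboard/details_TheBloke__VicUnlocked-alpaca-65B-QLoRA-fp16",
	"results",
	split="latest")
# One row per run; inspect the schema rather than assuming field names.
print(results.column_names)
```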
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_TheBloke__VicUnlocked-alpaca-65B-QLoRA-fp16
|
[
"region:us"
] |
2023-08-18T10:27:10+00:00
|
{"pretty_name": "Evaluation run of TheBloke/VicUnlocked-alpaca-65B-QLoRA-fp16", "dataset_summary": "Dataset automatically created during the evaluation run of model [TheBloke/VicUnlocked-alpaca-65B-QLoRA-fp16](https://huggingface.co/TheBloke/VicUnlocked-alpaca-65B-QLoRA-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__VicUnlocked-alpaca-65B-QLoRA-fp16\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-23T05:26:06.926177](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__VicUnlocked-alpaca-65B-QLoRA-fp16/blob/main/results_2023-10-23T05-26-06.926177.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.002202181208053691,\n \"em_stderr\": 0.0004800510816619414,\n \"f1\": 0.07708053691275155,\n \"f1_stderr\": 0.0014808362697713243,\n \"acc\": 0.5455925269256983,\n \"acc_stderr\": 0.011651760053332794\n },\n \"harness|drop|3\": {\n \"em\": 0.002202181208053691,\n \"em_stderr\": 0.0004800510816619414,\n \"f1\": 0.07708053691275155,\n \"f1_stderr\": 0.0014808362697713243\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.27824109173616374,\n \"acc_stderr\": 0.012343803671422678\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8129439621152328,\n \"acc_stderr\": 0.01095971643524291\n }\n}\n```", "repo_url": "https://huggingface.co/TheBloke/VicUnlocked-alpaca-65B-QLoRA-fp16", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_25T19_42_29.328886", "path": ["**/details_harness|arc:challenge|25_2023-07-25T19:42:29.328886.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-25T19:42:29.328886.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_23T05_26_06.926177", "path": ["**/details_harness|drop|3_2023-10-23T05-26-06.926177.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-23T05-26-06.926177.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_23T05_26_06.926177", "path": ["**/details_harness|gsm8k|5_2023-10-23T05-26-06.926177.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-23T05-26-06.926177.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_25T19_42_29.328886", "path": ["**/details_harness|hellaswag|10_2023-07-25T19:42:29.328886.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hellaswag|10_2023-07-25T19:42:29.328886.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_25T19_42_29.328886", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T19:42:29.328886.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-25T19:42:29.328886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-25T19:42:29.328886.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T19:42:29.328886.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-25T19:42:29.328886.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-07-25T19:42:29.328886.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_25T19_42_29.328886", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T19:42:29.328886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T19:42:29.328886.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_25T19_42_29.328886", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-25T19:42:29.328886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-25T19:42:29.328886.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_25T19_42_29.328886", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-25T19:42:29.328886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-25T19:42:29.328886.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_25T19_42_29.328886", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T19:42:29.328886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T19:42:29.328886.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_25T19_42_29.328886", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T19:42:29.328886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T19:42:29.328886.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_25T19_42_29.328886", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-25T19:42:29.328886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-25T19:42:29.328886.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_25T19_42_29.328886", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T19:42:29.328886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T19:42:29.328886.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_25T19_42_29.328886", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T19:42:29.328886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T19:42:29.328886.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_25T19_42_29.328886", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T19:42:29.328886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T19:42:29.328886.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_25T19_42_29.328886", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T19:42:29.328886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T19:42:29.328886.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_07_25T19_42_29.328886", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-25T19:42:29.328886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-25T19:42:29.328886.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_25T19_42_29.328886", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-25T19:42:29.328886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-25T19:42:29.328886.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_25T19_42_29.328886", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T19:42:29.328886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T19:42:29.328886.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_25T19_42_29.328886", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-25T19:42:29.328886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-25T19:42:29.328886.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_25T19_42_29.328886", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T19:42:29.328886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T19:42:29.328886.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_25T19_42_29.328886", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T19:42:29.328886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T19:42:29.328886.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_25T19_42_29.328886", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T19:42:29.328886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T19:42:29.328886.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_25T19_42_29.328886", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-25T19:42:29.328886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-25T19:42:29.328886.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_25T19_42_29.328886", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T19:42:29.328886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T19:42:29.328886.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_25T19_42_29.328886", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T19:42:29.328886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T19:42:29.328886.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_25T19_42_29.328886", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T19:42:29.328886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T19:42:29.328886.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_25T19_42_29.328886", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T19:42:29.328886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T19:42:29.328886.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_25T19_42_29.328886", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T19:42:29.328886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T19:42:29.328886.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_25T19_42_29.328886", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T19:42:29.328886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T19:42:29.328886.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_25T19_42_29.328886", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T19:42:29.328886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T19:42:29.328886.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_25T19_42_29.328886", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T19:42:29.328886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T19:42:29.328886.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_25T19_42_29.328886", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T19:42:29.328886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T19:42:29.328886.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_25T19_42_29.328886", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T19:42:29.328886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T19:42:29.328886.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_25T19_42_29.328886", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T19:42:29.328886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T19:42:29.328886.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_25T19_42_29.328886", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T19:42:29.328886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T19:42:29.328886.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_25T19_42_29.328886", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T19:42:29.328886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T19:42:29.328886.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_25T19_42_29.328886", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T19:42:29.328886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T19:42:29.328886.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_25T19_42_29.328886", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-25T19:42:29.328886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-25T19:42:29.328886.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_25T19_42_29.328886", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T19:42:29.328886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T19:42:29.328886.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_25T19_42_29.328886", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-25T19:42:29.328886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-25T19:42:29.328886.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_25T19_42_29.328886", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T19:42:29.328886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T19:42:29.328886.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_25T19_42_29.328886", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T19:42:29.328886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T19:42:29.328886.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_25T19_42_29.328886", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T19:42:29.328886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T19:42:29.328886.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_25T19_42_29.328886", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-25T19:42:29.328886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-25T19:42:29.328886.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_25T19_42_29.328886", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-25T19:42:29.328886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-25T19:42:29.328886.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_25T19_42_29.328886", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T19:42:29.328886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T19:42:29.328886.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_25T19_42_29.328886", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T19:42:29.328886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T19:42:29.328886.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_25T19_42_29.328886", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T19:42:29.328886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T19:42:29.328886.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_25T19_42_29.328886", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T19:42:29.328886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T19:42:29.328886.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_25T19_42_29.328886", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-25T19:42:29.328886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-25T19:42:29.328886.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_25T19_42_29.328886", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-25T19:42:29.328886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-25T19:42:29.328886.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_25T19_42_29.328886", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-25T19:42:29.328886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-25T19:42:29.328886.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_25T19_42_29.328886", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T19:42:29.328886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T19:42:29.328886.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_25T19_42_29.328886", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-25T19:42:29.328886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-25T19:42:29.328886.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_25T19_42_29.328886", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T19:42:29.328886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T19:42:29.328886.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_25T19_42_29.328886", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T19:42:29.328886.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T19:42:29.328886.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_25T19_42_29.328886", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-25T19:42:29.328886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-25T19:42:29.328886.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_25T19_42_29.328886", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-25T19:42:29.328886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-25T19:42:29.328886.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_25T19_42_29.328886", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-25T19:42:29.328886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-25T19:42:29.328886.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_25T19_42_29.328886", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T19:42:29.328886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T19:42:29.328886.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_25T19_42_29.328886", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-25T19:42:29.328886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-25T19:42:29.328886.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_25T19_42_29.328886", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-25T19:42:29.328886.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-25T19:42:29.328886.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_25T19_42_29.328886", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-25T19:42:29.328886.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-25T19:42:29.328886.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_23T05_26_06.926177", "path": ["**/details_harness|winogrande|5_2023-10-23T05-26-06.926177.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-23T05-26-06.926177.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_25T19_42_29.328886", "path": ["results_2023-07-25T19:42:29.328886.parquet"]}, {"split": "2023_10_23T05_26_06.926177", "path": ["results_2023-10-23T05-26-06.926177.parquet"]}, {"split": "latest", "path": ["results_2023-10-23T05-26-06.926177.parquet"]}]}]}
|
2023-10-23T04:26:19+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of TheBloke/VicUnlocked-alpaca-65B-QLoRA-fp16
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model TheBloke/VicUnlocked-alpaca-65B-QLoRA-fp16 on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
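A minimal sketch of that call, mirroring the snippet in the full card above:

```python
from datasets import load_dataset

data = load_dataset("open-llm-leaderboard/details_TheBloke__VicUnlocked-alpaca-65B-QLoRA-fp16",
	"harness_winogrande_5",
	split="train")
```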
## Latest results
These are the latest results from run 2023-10-23T05:26:06.926177 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of TheBloke/VicUnlocked-alpaca-65B-QLoRA-fp16",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/VicUnlocked-alpaca-65B-QLoRA-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-23T05:26:06.926177(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of TheBloke/VicUnlocked-alpaca-65B-QLoRA-fp16",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/VicUnlocked-alpaca-65B-QLoRA-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-23T05:26:06.926177(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
33,
31,
181,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of TheBloke/VicUnlocked-alpaca-65B-QLoRA-fp16## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/VicUnlocked-alpaca-65B-QLoRA-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-23T05:26:06.926177(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
dc429005e585be3d2c54967ab86baa8ccab9144f
|
# Dataset Card for Evaluation run of TheBloke/wizard-vicuna-13B-HF
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/wizard-vicuna-13B-HF
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/wizard-vicuna-13B-HF](https://huggingface.co/TheBloke/wizard-vicuna-13B-HF) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__wizard-vicuna-13B-HF",
"harness_winogrande_5",
split="train")
```
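If you are not sure which of the 64 configurations you need, you can enumerate them first. The sketch below uses the standard `datasets` inspection API; the names returned should match the `config_name` entries listed in this repository's metadata.

```python
from datasets import get_dataset_config_names

# List every evaluation configuration stored in this repository:
# one per task (e.g. "harness_arc_challenge_25", "harness_winogrande_5"),
# plus the aggregated "results" configuration.
configs = get_dataset_config_names("open-llm-leaderboard/details_TheBloke__wizard-vicuna-13B-HF")
print(configs)
```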
## Latest results
These are the [latest results from run 2023-10-22T05:16:09.820423](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__wizard-vicuna-13B-HF/blob/main/results_2023-10-22T05-16-09.820423.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.0350251677852349,
"em_stderr": 0.0018827287598880416,
"f1": 0.10088821308724859,
"f1_stderr": 0.0023095858218995214,
"acc": 0.4207383077634691,
"acc_stderr": 0.010104088969294184
},
"harness|drop|3": {
"em": 0.0350251677852349,
"em_stderr": 0.0018827287598880416,
"f1": 0.10088821308724859,
"f1_stderr": 0.0023095858218995214
},
"harness|gsm8k|5": {
"acc": 0.0932524639878696,
"acc_stderr": 0.008009688838328585
},
"harness|winogrande|5": {
"acc": 0.7482241515390686,
"acc_stderr": 0.012198489100259785
}
}
```
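Rather than copying these numbers from the card, you can load them from the "results" configuration, whose "latest" split always points at the most recent run. This is a minimal sketch; the exact column layout of the results parquet file is an assumption and may vary between harness versions.

```python
from datasets import load_dataset

# The "results" config stores one row per evaluation run;
# the "latest" split resolves to the most recent results parquet.
results = load_dataset(
    "open-llm-leaderboard/details_TheBloke__wizard-vicuna-13B-HF",
    "results",
    split="latest",
)
print(results[0])  # aggregated metrics for the latest run, as shown above
```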
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_TheBloke__wizard-vicuna-13B-HF
|
[
"region:us"
] |
2023-08-18T10:27:19+00:00
|
{"pretty_name": "Evaluation run of TheBloke/wizard-vicuna-13B-HF", "dataset_summary": "Dataset automatically created during the evaluation run of model [TheBloke/wizard-vicuna-13B-HF](https://huggingface.co/TheBloke/wizard-vicuna-13B-HF) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__wizard-vicuna-13B-HF\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-22T05:16:09.820423](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__wizard-vicuna-13B-HF/blob/main/results_2023-10-22T05-16-09.820423.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0350251677852349,\n \"em_stderr\": 0.0018827287598880416,\n \"f1\": 0.10088821308724859,\n \"f1_stderr\": 0.0023095858218995214,\n \"acc\": 0.4207383077634691,\n \"acc_stderr\": 0.010104088969294184\n },\n \"harness|drop|3\": {\n \"em\": 0.0350251677852349,\n \"em_stderr\": 0.0018827287598880416,\n \"f1\": 0.10088821308724859,\n \"f1_stderr\": 0.0023095858218995214\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0932524639878696,\n \"acc_stderr\": 0.008009688838328585\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7482241515390686,\n \"acc_stderr\": 0.012198489100259785\n }\n}\n```", "repo_url": "https://huggingface.co/TheBloke/wizard-vicuna-13B-HF", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_18T15_41_31.806863", "path": ["**/details_harness|arc:challenge|25_2023-07-18T15:41:31.806863.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-18T15:41:31.806863.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_22T05_16_09.820423", "path": ["**/details_harness|drop|3_2023-10-22T05-16-09.820423.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-22T05-16-09.820423.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_22T05_16_09.820423", "path": ["**/details_harness|gsm8k|5_2023-10-22T05-16-09.820423.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-22T05-16-09.820423.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_18T15_41_31.806863", "path": ["**/details_harness|hellaswag|10_2023-07-18T15:41:31.806863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-18T15:41:31.806863.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", 
"data_files": [{"split": "2023_07_18T15_41_31.806863", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T15:41:31.806863.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-18T15:41:31.806863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T15:41:31.806863.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-18T15:41:31.806863.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-18T15:41:31.806863.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_18T15_41_31.806863", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T15:41:31.806863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T15:41:31.806863.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_18T15_41_31.806863", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-18T15:41:31.806863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-18T15:41:31.806863.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_18T15_41_31.806863", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-18T15:41:31.806863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-18T15:41:31.806863.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_18T15_41_31.806863", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T15:41:31.806863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T15:41:31.806863.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_18T15_41_31.806863", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T15:41:31.806863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T15:41:31.806863.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_18T15_41_31.806863", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-18T15:41:31.806863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-18T15:41:31.806863.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_18T15_41_31.806863", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T15:41:31.806863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T15:41:31.806863.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_18T15_41_31.806863", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T15:41:31.806863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T15:41:31.806863.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_18T15_41_31.806863", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T15:41:31.806863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T15:41:31.806863.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_18T15_41_31.806863", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T15:41:31.806863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T15:41:31.806863.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_18T15_41_31.806863", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-18T15:41:31.806863.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-18T15:41:31.806863.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_18T15_41_31.806863", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-18T15:41:31.806863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-18T15:41:31.806863.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_18T15_41_31.806863", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T15:41:31.806863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T15:41:31.806863.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_18T15_41_31.806863", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-18T15:41:31.806863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-18T15:41:31.806863.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_18T15_41_31.806863", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T15:41:31.806863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T15:41:31.806863.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_18T15_41_31.806863", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T15:41:31.806863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T15:41:31.806863.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_18T15_41_31.806863", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T15:41:31.806863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T15:41:31.806863.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_18T15_41_31.806863", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-18T15:41:31.806863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-18T15:41:31.806863.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_18T15_41_31.806863", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T15:41:31.806863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T15:41:31.806863.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_18T15_41_31.806863", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T15:41:31.806863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T15:41:31.806863.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_18T15_41_31.806863", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T15:41:31.806863.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T15:41:31.806863.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_18T15_41_31.806863", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T15:41:31.806863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T15:41:31.806863.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_18T15_41_31.806863", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T15:41:31.806863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T15:41:31.806863.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_18T15_41_31.806863", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T15:41:31.806863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T15:41:31.806863.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_18T15_41_31.806863", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T15:41:31.806863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T15:41:31.806863.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_18T15_41_31.806863", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T15:41:31.806863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T15:41:31.806863.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_18T15_41_31.806863", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T15:41:31.806863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T15:41:31.806863.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_18T15_41_31.806863", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T15:41:31.806863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T15:41:31.806863.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_18T15_41_31.806863", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T15:41:31.806863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T15:41:31.806863.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_18T15_41_31.806863", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T15:41:31.806863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T15:41:31.806863.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_18T15_41_31.806863", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T15:41:31.806863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T15:41:31.806863.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_18T15_41_31.806863", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T15:41:31.806863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T15:41:31.806863.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_18T15_41_31.806863", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-18T15:41:31.806863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-18T15:41:31.806863.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_18T15_41_31.806863", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T15:41:31.806863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T15:41:31.806863.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_18T15_41_31.806863", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-18T15:41:31.806863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-18T15:41:31.806863.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_18T15_41_31.806863", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T15:41:31.806863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T15:41:31.806863.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_18T15_41_31.806863", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T15:41:31.806863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T15:41:31.806863.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_18T15_41_31.806863", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T15:41:31.806863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T15:41:31.806863.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_18T15_41_31.806863", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-18T15:41:31.806863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-18T15:41:31.806863.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_18T15_41_31.806863", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-18T15:41:31.806863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-18T15:41:31.806863.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_18T15_41_31.806863", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T15:41:31.806863.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T15:41:31.806863.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_18T15_41_31.806863", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T15:41:31.806863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T15:41:31.806863.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_18T15_41_31.806863", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T15:41:31.806863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T15:41:31.806863.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_18T15_41_31.806863", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T15:41:31.806863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T15:41:31.806863.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_18T15_41_31.806863", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-18T15:41:31.806863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-18T15:41:31.806863.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_18T15_41_31.806863", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-18T15:41:31.806863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-18T15:41:31.806863.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_18T15_41_31.806863", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-18T15:41:31.806863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-18T15:41:31.806863.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_18T15_41_31.806863", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T15:41:31.806863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T15:41:31.806863.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_18T15_41_31.806863", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-18T15:41:31.806863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-18T15:41:31.806863.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_18T15_41_31.806863", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T15:41:31.806863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T15:41:31.806863.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_18T15_41_31.806863", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T15:41:31.806863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T15:41:31.806863.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_18T15_41_31.806863", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-18T15:41:31.806863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-18T15:41:31.806863.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_18T15_41_31.806863", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-18T15:41:31.806863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-18T15:41:31.806863.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_18T15_41_31.806863", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-18T15:41:31.806863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-18T15:41:31.806863.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_18T15_41_31.806863", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T15:41:31.806863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T15:41:31.806863.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_18T15_41_31.806863", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-18T15:41:31.806863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-18T15:41:31.806863.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_18T15_41_31.806863", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-18T15:41:31.806863.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-18T15:41:31.806863.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_18T15_41_31.806863", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-18T15:41:31.806863.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-18T15:41:31.806863.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_22T05_16_09.820423", "path": ["**/details_harness|winogrande|5_2023-10-22T05-16-09.820423.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-22T05-16-09.820423.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_18T15_41_31.806863", "path": ["results_2023-07-18T15:41:31.806863.parquet"]}, {"split": "2023_10_22T05_16_09.820423", "path": ["results_2023-10-22T05-16-09.820423.parquet"]}, {"split": "latest", "path": ["results_2023-10-22T05-16-09.820423.parquet"]}]}]}
|
2023-10-22T04:16:22+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of TheBloke/wizard-vicuna-13B-HF
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model TheBloke/wizard-vicuna-13B-HF on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-10-22T05:16:09.820423 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and in the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of TheBloke/wizard-vicuna-13B-HF",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/wizard-vicuna-13B-HF on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-22T05:16:09.820423(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of TheBloke/wizard-vicuna-13B-HF",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/wizard-vicuna-13B-HF on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-22T05:16:09.820423(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
23,
31,
171,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of TheBloke/wizard-vicuna-13B-HF## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/wizard-vicuna-13B-HF on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-22T05:16:09.820423(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
d04c78e724691c1e686dba9ba09eb17b12d5a527
|
# Dataset Card for Evaluation run of TheBloke/guanaco-65B-HF
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/guanaco-65B-HF
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/guanaco-65B-HF](https://huggingface.co/TheBloke/guanaco-65B-HF) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__guanaco-65B-HF",
"harness_winogrande_5",
split="train")
```
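Each run is also addressable by its timestamped split name, which is useful for comparing successive evaluations instead of only reading the latest one. A short sketch; the split name below is taken from this card's configuration metadata.

```python
from datasets import load_dataset

# Load one specific GSM8K run by its timestamped split rather than "latest".
run = load_dataset(
    "open-llm-leaderboard/details_TheBloke__guanaco-65B-HF",
    "harness_gsm8k_5",
    split="2023_10_23T03_09_40.214751",
)
print(len(run))  # number of evaluated GSM8K samples in that run
```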
## Latest results
These are the [latest results from run 2023-10-23T03:09:40.214751](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__guanaco-65B-HF/blob/main/results_2023-10-23T03-09-40.214751.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.0019924496644295304,
"em_stderr": 0.0004566676462666983,
"f1": 0.06694840604026871,
"f1_stderr": 0.0014210409267209844,
"acc": 0.5420195874394811,
"acc_stderr": 0.011392971611327397
},
"harness|drop|3": {
"em": 0.0019924496644295304,
"em_stderr": 0.0004566676462666983,
"f1": 0.06694840604026871,
"f1_stderr": 0.0014210409267209844
},
"harness|gsm8k|5": {
"acc": 0.26004548900682334,
"acc_stderr": 0.012082852340334089
},
"harness|winogrande|5": {
"acc": 0.823993685872139,
"acc_stderr": 0.010703090882320705
}
}
```
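The per-sample predictions behind these aggregates can be inspected the same way. The column layout of the detail files is not documented on this card, so the sketch below only loads a task's latest split and inspects its schema rather than assuming field names.

```python
from datasets import load_dataset

# Per-sample Winogrande details for the latest run.
details = load_dataset(
    "open-llm-leaderboard/details_TheBloke__guanaco-65B-HF",
    "harness_winogrande_5",
    split="latest",
)
print(details.column_names)  # per-sample fields vary by harness version
print(len(details))          # number of evaluated Winogrande samples
```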
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_TheBloke__guanaco-65B-HF
|
[
"region:us"
] |
2023-08-18T10:27:27+00:00
|
{"pretty_name": "Evaluation run of TheBloke/guanaco-65B-HF", "dataset_summary": "Dataset automatically created during the evaluation run of model [TheBloke/guanaco-65B-HF](https://huggingface.co/TheBloke/guanaco-65B-HF) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__guanaco-65B-HF\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-23T03:09:40.214751](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__guanaco-65B-HF/blob/main/results_2023-10-23T03-09-40.214751.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0019924496644295304,\n \"em_stderr\": 0.0004566676462666983,\n \"f1\": 0.06694840604026871,\n \"f1_stderr\": 0.0014210409267209844,\n \"acc\": 0.5420195874394811,\n \"acc_stderr\": 0.011392971611327397\n },\n \"harness|drop|3\": {\n \"em\": 0.0019924496644295304,\n \"em_stderr\": 0.0004566676462666983,\n \"f1\": 0.06694840604026871,\n \"f1_stderr\": 0.0014210409267209844\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.26004548900682334,\n \"acc_stderr\": 0.012082852340334089\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.823993685872139,\n \"acc_stderr\": 0.010703090882320705\n }\n}\n```", "repo_url": "https://huggingface.co/TheBloke/guanaco-65B-HF", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_25T19_41_45.375855", "path": ["**/details_harness|arc:challenge|25_2023-07-25T19:41:45.375855.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-25T19:41:45.375855.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_23T03_09_40.214751", "path": ["**/details_harness|drop|3_2023-10-23T03-09-40.214751.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-23T03-09-40.214751.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_23T03_09_40.214751", "path": ["**/details_harness|gsm8k|5_2023-10-23T03-09-40.214751.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-23T03-09-40.214751.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_25T19_41_45.375855", "path": ["**/details_harness|hellaswag|10_2023-07-25T19:41:45.375855.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-25T19:41:45.375855.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_07_25T19_41_45.375855", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T19:41:45.375855.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-25T19:41:45.375855.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T19:41:45.375855.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-25T19:41:45.375855.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-25T19:41:45.375855.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_25T19_41_45.375855", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T19:41:45.375855.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T19:41:45.375855.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_25T19_41_45.375855", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-25T19:41:45.375855.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-25T19:41:45.375855.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_25T19_41_45.375855", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-25T19:41:45.375855.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-25T19:41:45.375855.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_25T19_41_45.375855", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T19:41:45.375855.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T19:41:45.375855.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_25T19_41_45.375855", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T19:41:45.375855.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T19:41:45.375855.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_25T19_41_45.375855", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-25T19:41:45.375855.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-25T19:41:45.375855.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_25T19_41_45.375855", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T19:41:45.375855.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T19:41:45.375855.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_25T19_41_45.375855", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T19:41:45.375855.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T19:41:45.375855.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_25T19_41_45.375855", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T19:41:45.375855.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T19:41:45.375855.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_25T19_41_45.375855", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T19:41:45.375855.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T19:41:45.375855.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_25T19_41_45.375855", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-25T19:41:45.375855.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-25T19:41:45.375855.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_25T19_41_45.375855", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-25T19:41:45.375855.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-25T19:41:45.375855.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_25T19_41_45.375855", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T19:41:45.375855.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T19:41:45.375855.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_25T19_41_45.375855", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-25T19:41:45.375855.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-25T19:41:45.375855.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_25T19_41_45.375855", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T19:41:45.375855.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T19:41:45.375855.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_25T19_41_45.375855", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T19:41:45.375855.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T19:41:45.375855.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_25T19_41_45.375855", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T19:41:45.375855.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T19:41:45.375855.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_25T19_41_45.375855", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-25T19:41:45.375855.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-25T19:41:45.375855.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_25T19_41_45.375855", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T19:41:45.375855.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T19:41:45.375855.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_25T19_41_45.375855", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T19:41:45.375855.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T19:41:45.375855.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_25T19_41_45.375855", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T19:41:45.375855.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T19:41:45.375855.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_25T19_41_45.375855", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T19:41:45.375855.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T19:41:45.375855.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_25T19_41_45.375855", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T19:41:45.375855.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T19:41:45.375855.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_25T19_41_45.375855", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T19:41:45.375855.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T19:41:45.375855.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_25T19_41_45.375855", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T19:41:45.375855.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T19:41:45.375855.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_25T19_41_45.375855", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T19:41:45.375855.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T19:41:45.375855.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_25T19_41_45.375855", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T19:41:45.375855.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T19:41:45.375855.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_25T19_41_45.375855", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T19:41:45.375855.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T19:41:45.375855.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_25T19_41_45.375855", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T19:41:45.375855.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T19:41:45.375855.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_25T19_41_45.375855", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T19:41:45.375855.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T19:41:45.375855.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_25T19_41_45.375855", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T19:41:45.375855.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T19:41:45.375855.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_25T19_41_45.375855", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T19:41:45.375855.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T19:41:45.375855.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_25T19_41_45.375855", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-25T19:41:45.375855.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-25T19:41:45.375855.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_25T19_41_45.375855", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T19:41:45.375855.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T19:41:45.375855.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_25T19_41_45.375855", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-25T19:41:45.375855.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-25T19:41:45.375855.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_25T19_41_45.375855", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T19:41:45.375855.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T19:41:45.375855.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_25T19_41_45.375855", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T19:41:45.375855.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T19:41:45.375855.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_25T19_41_45.375855", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T19:41:45.375855.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T19:41:45.375855.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_25T19_41_45.375855", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-25T19:41:45.375855.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-25T19:41:45.375855.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_25T19_41_45.375855", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-25T19:41:45.375855.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-25T19:41:45.375855.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_25T19_41_45.375855", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T19:41:45.375855.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T19:41:45.375855.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_25T19_41_45.375855", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T19:41:45.375855.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T19:41:45.375855.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_25T19_41_45.375855", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T19:41:45.375855.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T19:41:45.375855.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_25T19_41_45.375855", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T19:41:45.375855.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T19:41:45.375855.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_25T19_41_45.375855", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-25T19:41:45.375855.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-25T19:41:45.375855.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_25T19_41_45.375855", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-25T19:41:45.375855.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-25T19:41:45.375855.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_25T19_41_45.375855", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-25T19:41:45.375855.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-25T19:41:45.375855.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_25T19_41_45.375855", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T19:41:45.375855.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T19:41:45.375855.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_25T19_41_45.375855", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-25T19:41:45.375855.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-25T19:41:45.375855.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_25T19_41_45.375855", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T19:41:45.375855.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T19:41:45.375855.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_25T19_41_45.375855", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T19:41:45.375855.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T19:41:45.375855.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_25T19_41_45.375855", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-25T19:41:45.375855.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-25T19:41:45.375855.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_25T19_41_45.375855", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-25T19:41:45.375855.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-25T19:41:45.375855.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_25T19_41_45.375855", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-25T19:41:45.375855.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-25T19:41:45.375855.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_25T19_41_45.375855", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T19:41:45.375855.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T19:41:45.375855.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_25T19_41_45.375855", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-25T19:41:45.375855.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-25T19:41:45.375855.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_25T19_41_45.375855", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-25T19:41:45.375855.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-25T19:41:45.375855.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_25T19_41_45.375855", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-25T19:41:45.375855.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-25T19:41:45.375855.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_23T03_09_40.214751", "path": ["**/details_harness|winogrande|5_2023-10-23T03-09-40.214751.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-23T03-09-40.214751.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_25T19_41_45.375855", "path": ["results_2023-07-25T19:41:45.375855.parquet"]}, {"split": "2023_10_23T03_09_40.214751", "path": ["results_2023-10-23T03-09-40.214751.parquet"]}, {"split": "latest", "path": ["results_2023-10-23T03-09-40.214751.parquet"]}]}]}
|
2023-10-23T02:09:53+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of TheBloke/guanaco-65B-HF
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model TheBloke/guanaco-65B-HF on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
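A minimal sketch is below; the repository id follows the leaderboard's usual `details_<org>__<model>` naming and is an assumption, while the config name and timestamped split are taken from this card's file listing:
```python
from datasets import load_dataset

# Repo id inferred from the leaderboard's naming convention (assumption);
# "harness_winogrande_5" is one of the configurations listed for this run.
data = load_dataset(
    "open-llm-leaderboard/details_TheBloke__guanaco-65B-HF",
    "harness_winogrande_5",
    split="latest",  # or a timestamped split such as "2023_10_23T03_09_40.214751"
)
```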
## Latest results
These are the latest results from run 2023-10-23T03:09:40.214751 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of TheBloke/guanaco-65B-HF",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/guanaco-65B-HF on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-23T03:09:40.214751(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of TheBloke/guanaco-65B-HF",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/guanaco-65B-HF on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-23T03:09:40.214751(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
22,
31,
170,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of TheBloke/guanaco-65B-HF## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/guanaco-65B-HF on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-23T03:09:40.214751(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
6466acb96dcc6d2f440b96d78ed87e797de1f534
|
# Dataset of star_sapphire/スターサファイア/스타사파이어 (Touhou)
This is the dataset of star_sapphire/スターサファイア/스타사파이어 (Touhou), containing 137 images and their tags.
The core tags of this character are `long_hair, bow, hair_bow, wings, brown_eyes, brown_hair, fairy_wings, black_hair, ribbon`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 137 | 129.96 MiB | [Download](https://huggingface.co/datasets/CyberHarem/star_sapphire_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 137 | 94.79 MiB | [Download](https://huggingface.co/datasets/CyberHarem/star_sapphire_touhou/resolve/main/dataset-800.zip) | IMG+TXT | Dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 257 | 165.68 MiB | [Download](https://huggingface.co/datasets/CyberHarem/star_sapphire_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 137 | 122.72 MiB | [Download](https://huggingface.co/datasets/CyberHarem/star_sapphire_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | Dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 257 | 204.20 MiB | [Download](https://huggingface.co/datasets/CyberHarem/star_sapphire_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
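If you only need plain image/tag pairs rather than the waifuc pipeline below, here is a minimal sketch for unpacking one of the IMG+TXT packages. It assumes each image in the archive sits next to a same-named `.txt` file holding its tags, which matches the package type above but is not otherwise verified:

```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# Download the 800px IMG+TXT package (smaller than the raw archive).
zip_file = hf_hub_download(
    repo_id='CyberHarem/star_sapphire_touhou',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# Extract into a local directory.
out_dir = 'star_sapphire_800'
os.makedirs(out_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(out_dir)

# Pair each .txt tag file with its image by shared filename stem (assumed layout).
for name in sorted(os.listdir(out_dir)):
    if name.endswith('.txt'):
        with open(os.path.join(out_dir, name), encoding='utf-8') as f:
            print(name, '->', f.read().strip())
```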
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/star_sapphire_touhou',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 35 |  |  |  |  |  | 1girl, solo, dress, smile, star_(symbol), blush, open_mouth, yellow_eyes |
| 1 | 15 |  |  |  |  |  | 1girl, blue_dress, looking_at_viewer, solo, blunt_bangs, smile, star_(symbol), blue_bow, full_body, wide_sleeves, frills, juliet_sleeves, simple_background, white_background, fairy, blush, brown_footwear, boots |
| 2 | 14 |  |  |  |  |  | 2girls, smile, dress, one_eye_closed, open_mouth, blonde_hair, star_(symbol) |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | dress | smile | star_(symbol) | blush | open_mouth | yellow_eyes | blue_dress | looking_at_viewer | blunt_bangs | blue_bow | full_body | wide_sleeves | frills | juliet_sleeves | simple_background | white_background | fairy | brown_footwear | boots | 2girls | one_eye_closed | blonde_hair |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------|:--------|:----------------|:--------|:-------------|:--------------|:-------------|:--------------------|:--------------|:-----------|:------------|:---------------|:---------|:-----------------|:--------------------|:-------------------|:--------|:-----------------|:--------|:---------|:-----------------|:--------------|
| 0 | 35 |  |  |  |  |  | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | |
| 1 | 15 |  |  |  |  |  | X | X | | X | X | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | |
| 2 | 14 |  |  |  |  |  | | | X | X | X | | X | | | | | | | | | | | | | | | X | X | X |
|
CyberHarem/star_sapphire_touhou
|
[
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] |
2023-08-18T10:27:29+00:00
|
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
|
2024-01-14T22:03:27+00:00
|
[] |
[] |
TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
|
Dataset of star\_sapphire/スターサファイア/스타사파이어 (Touhou)
==================================================
This is the dataset of star\_sapphire/スターサファイア/스타사파이어 (Touhou), containing 137 images and their tags.
The core tags of this character are 'long\_hair, bow, hair\_bow, wings, brown\_eyes, brown\_hair, fairy\_wings, black\_hair, ribbon', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by DeepGHS Team (huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code:
List of Clusters
----------------
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
### Table Version
|
[
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
44,
61,
5,
4
] |
[
"passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version"
] |
fa55907dba524503057ea2a8576a4bcef6705cca
|
# Dataset Card for Evaluation run of TheBloke/Wizard-Vicuna-30B-Superhot-8K-fp16
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/Wizard-Vicuna-30B-Superhot-8K-fp16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/Wizard-Vicuna-30B-Superhot-8K-fp16](https://huggingface.co/TheBloke/Wizard-Vicuna-30B-Superhot-8K-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__Wizard-Vicuna-30B-Superhot-8K-fp16",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-07-31T18:46:06.024423](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Wizard-Vicuna-30B-Superhot-8K-fp16/blob/main/results_2023-07-31T18%3A46%3A06.024423.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.23519468841762173,
"acc_stderr": 0.030867946729594396,
"acc_norm": 0.23665032922383497,
"acc_norm_stderr": 0.03088234450623421,
"mc1": 0.22766217870257038,
"mc1_stderr": 0.01467925503211107,
"mc2": 0.4747511496520905,
"mc2_stderr": 0.016743067237896876
},
"harness|arc:challenge|25": {
"acc": 0.22525597269624573,
"acc_stderr": 0.012207839995407312,
"acc_norm": 0.2619453924914676,
"acc_norm_stderr": 0.012849054826858115
},
"harness|hellaswag|10": {
"acc": 0.2804222266480781,
"acc_stderr": 0.004482874732237348,
"acc_norm": 0.3296156144194384,
"acc_norm_stderr": 0.004691128722535483
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.1925925925925926,
"acc_stderr": 0.03406542058502653,
"acc_norm": 0.1925925925925926,
"acc_norm_stderr": 0.03406542058502653
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.19245283018867926,
"acc_stderr": 0.024262979839372277,
"acc_norm": 0.19245283018867926,
"acc_norm_stderr": 0.024262979839372277
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749874,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749874
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135303,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135303
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24867724867724866,
"acc_stderr": 0.022261817692400168,
"acc_norm": 0.24867724867724866,
"acc_norm_stderr": 0.022261817692400168
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04040610178208841,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04040610178208841
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.18064516129032257,
"acc_stderr": 0.02188617856717255,
"acc_norm": 0.18064516129032257,
"acc_norm_stderr": 0.02188617856717255
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.15270935960591134,
"acc_stderr": 0.02530890453938063,
"acc_norm": 0.15270935960591134,
"acc_norm_stderr": 0.02530890453938063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2606060606060606,
"acc_stderr": 0.03427743175816524,
"acc_norm": 0.2606060606060606,
"acc_norm_stderr": 0.03427743175816524
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.17676767676767677,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.17676767676767677,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.18652849740932642,
"acc_stderr": 0.028112091210117447,
"acc_norm": 0.18652849740932642,
"acc_norm_stderr": 0.028112091210117447
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2128205128205128,
"acc_stderr": 0.020752423722127995,
"acc_norm": 0.2128205128205128,
"acc_norm_stderr": 0.020752423722127995
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24814814814814815,
"acc_stderr": 0.0263357394040558,
"acc_norm": 0.24814814814814815,
"acc_norm_stderr": 0.0263357394040558
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.23949579831932774,
"acc_stderr": 0.027722065493361255,
"acc_norm": 0.23949579831932774,
"acc_norm_stderr": 0.027722065493361255
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436776,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436776
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1926605504587156,
"acc_stderr": 0.016909276884936094,
"acc_norm": 0.1926605504587156,
"acc_norm_stderr": 0.016909276884936094
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2361111111111111,
"acc_stderr": 0.02896370257079103,
"acc_norm": 0.2361111111111111,
"acc_norm_stderr": 0.02896370257079103
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2742616033755274,
"acc_stderr": 0.029041333510598035,
"acc_norm": 0.2742616033755274,
"acc_norm_stderr": 0.029041333510598035
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.31390134529147984,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.31390134529147984,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.27350427350427353,
"acc_stderr": 0.029202540153431163,
"acc_norm": 0.27350427350427353,
"acc_norm_stderr": 0.029202540153431163
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23754789272030652,
"acc_stderr": 0.015218733046150193,
"acc_norm": 0.23754789272030652,
"acc_norm_stderr": 0.015218733046150193
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24277456647398843,
"acc_stderr": 0.023083658586984204,
"acc_norm": 0.24277456647398843,
"acc_norm_stderr": 0.023083658586984204
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24581005586592178,
"acc_stderr": 0.01440029642922562,
"acc_norm": 0.24581005586592178,
"acc_norm_stderr": 0.01440029642922562
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.023929155517351284,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.023929155517351284
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1832797427652733,
"acc_stderr": 0.021974198848265805,
"acc_norm": 0.1832797427652733,
"acc_norm_stderr": 0.021974198848265805
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432417,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432417
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.20909090909090908,
"acc_stderr": 0.03895091015724136,
"acc_norm": 0.20909090909090908,
"acc_norm_stderr": 0.03895091015724136
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.22040816326530613,
"acc_stderr": 0.02653704531214529,
"acc_norm": 0.22040816326530613,
"acc_norm_stderr": 0.02653704531214529
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.28654970760233917,
"acc_stderr": 0.034678266857038266,
"acc_norm": 0.28654970760233917,
"acc_norm_stderr": 0.034678266857038266
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22766217870257038,
"mc1_stderr": 0.01467925503211107,
"mc2": 0.4747511496520905,
"mc2_stderr": 0.016743067237896876
}
}
```
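As a quick sanity check on these numbers, here is a minimal sketch that averages the 5-shot accuracy over the MMLU (`hendrycksTest`) subtasks. The stand-in dict copies just two entries from the JSON above, with the remaining subtasks elided:

```python
# Stand-in for the full results dict shown above; only the key/field
# shape matters for this computation.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.22},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.1925925925925926},
    # ... remaining hendrycksTest subtasks elided ...
}

# Select the MMLU subtasks by their "harness|hendrycksTest-" key prefix
# and compute the unweighted mean accuracy.
mmlu = {k: v for k, v in results.items() if k.startswith("harness|hendrycksTest-")}
mean_acc = sum(v["acc"] for v in mmlu.values()) / len(mmlu)
print(f"{len(mmlu)} MMLU subtasks, mean 5-shot acc = {mean_acc:.4f}")
```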
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_TheBloke__Wizard-Vicuna-30B-Superhot-8K-fp16
|
[
"region:us"
] |
2023-08-18T10:27:36+00:00
|
{"pretty_name": "Evaluation run of TheBloke/Wizard-Vicuna-30B-Superhot-8K-fp16", "dataset_summary": "Dataset automatically created during the evaluation run of model [TheBloke/Wizard-Vicuna-30B-Superhot-8K-fp16](https://huggingface.co/TheBloke/Wizard-Vicuna-30B-Superhot-8K-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__Wizard-Vicuna-30B-Superhot-8K-fp16\",\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-07-31T18:46:06.024423](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Wizard-Vicuna-30B-Superhot-8K-fp16/blob/main/results_2023-07-31T18%3A46%3A06.024423.json) (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.23519468841762173,\n \"acc_stderr\": 0.030867946729594396,\n \"acc_norm\": 0.23665032922383497,\n \"acc_norm_stderr\": 0.03088234450623421,\n \"mc1\": 0.22766217870257038,\n \"mc1_stderr\": 0.01467925503211107,\n \"mc2\": 0.4747511496520905,\n \"mc2_stderr\": 0.016743067237896876\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.22525597269624573,\n \"acc_stderr\": 0.012207839995407312,\n \"acc_norm\": 0.2619453924914676,\n \"acc_norm_stderr\": 0.012849054826858115\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2804222266480781,\n \"acc_stderr\": 0.004482874732237348,\n \"acc_norm\": 0.3296156144194384,\n \"acc_norm_stderr\": 0.004691128722535483\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.1925925925925926,\n \"acc_stderr\": 0.03406542058502653,\n \"acc_norm\": 0.1925925925925926,\n \"acc_norm_stderr\": 0.03406542058502653\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.19245283018867926,\n \"acc_stderr\": 0.024262979839372277,\n \"acc_norm\": 0.19245283018867926,\n \"acc_norm_stderr\": 0.024262979839372277\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n \"acc_stderr\": 0.030952890217749874,\n \"acc_norm\": 0.20809248554913296,\n \"acc_norm_stderr\": 0.030952890217749874\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135303,\n \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135303\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.24867724867724866,\n \"acc_stderr\": 0.022261817692400168,\n \"acc_norm\": 0.24867724867724866,\n \"acc_norm_stderr\": 0.022261817692400168\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.18064516129032257,\n \"acc_stderr\": 0.02188617856717255,\n \"acc_norm\": 0.18064516129032257,\n \"acc_norm_stderr\": 0.02188617856717255\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.15270935960591134,\n \"acc_stderr\": 0.02530890453938063,\n \"acc_norm\": 0.15270935960591134,\n \"acc_norm_stderr\": 0.02530890453938063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.2606060606060606,\n \"acc_stderr\": 0.03427743175816524,\n \"acc_norm\": 0.2606060606060606,\n \"acc_norm_stderr\": 0.03427743175816524\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.18652849740932642,\n \"acc_stderr\": 0.028112091210117447,\n \"acc_norm\": 
0.18652849740932642,\n \"acc_norm_stderr\": 0.028112091210117447\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.2128205128205128,\n \"acc_stderr\": 0.020752423722127995,\n \"acc_norm\": 0.2128205128205128,\n \"acc_norm_stderr\": 0.020752423722127995\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.24814814814814815,\n \"acc_stderr\": 0.0263357394040558,\n \"acc_norm\": 0.24814814814814815,\n \"acc_norm_stderr\": 0.0263357394040558\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.23949579831932774,\n \"acc_stderr\": 0.027722065493361255,\n \"acc_norm\": 0.23949579831932774,\n \"acc_norm_stderr\": 0.027722065493361255\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.1926605504587156,\n \"acc_stderr\": 0.016909276884936094,\n \"acc_norm\": 0.1926605504587156,\n \"acc_norm_stderr\": 0.016909276884936094\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.2361111111111111,\n \"acc_stderr\": 0.02896370257079103,\n \"acc_norm\": 0.2361111111111111,\n \"acc_norm_stderr\": 0.02896370257079103\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.2742616033755274,\n \"acc_stderr\": 0.029041333510598035,\n \"acc_norm\": 0.2742616033755274,\n \"acc_norm_stderr\": 0.029041333510598035\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.27350427350427353,\n \"acc_stderr\": 0.029202540153431163,\n \"acc_norm\": 0.27350427350427353,\n \"acc_norm_stderr\": 0.029202540153431163\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23754789272030652,\n \"acc_stderr\": 0.015218733046150193,\n \"acc_norm\": 0.23754789272030652,\n \"acc_norm_stderr\": 0.015218733046150193\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.24277456647398843,\n \"acc_stderr\": 0.023083658586984204,\n \"acc_norm\": 0.24277456647398843,\n \"acc_norm_stderr\": 0.023083658586984204\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24581005586592178,\n \"acc_stderr\": 0.01440029642922562,\n \"acc_norm\": 0.24581005586592178,\n \"acc_norm_stderr\": 0.01440029642922562\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1832797427652733,\n \"acc_stderr\": 0.021974198848265805,\n \"acc_norm\": 0.1832797427652733,\n \"acc_norm_stderr\": 0.021974198848265805\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.20909090909090908,\n \"acc_stderr\": 0.03895091015724136,\n \"acc_norm\": 0.20909090909090908,\n \"acc_norm_stderr\": 0.03895091015724136\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.22040816326530613,\n \"acc_stderr\": 0.02653704531214529,\n \"acc_norm\": 0.22040816326530613,\n \"acc_norm_stderr\": 0.02653704531214529\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.28654970760233917,\n \"acc_stderr\": 0.034678266857038266,\n \"acc_norm\": 0.28654970760233917,\n \"acc_norm_stderr\": 0.034678266857038266\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22766217870257038,\n \"mc1_stderr\": 0.01467925503211107,\n \"mc2\": 0.4747511496520905,\n \"mc2_stderr\": 0.016743067237896876\n }\n}\n```", "repo_url": "https://huggingface.co/TheBloke/Wizard-Vicuna-30B-Superhot-8K-fp16", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_31T18_46_06.024423", "path": ["**/details_harness|arc:challenge|25_2023-07-31T18:46:06.024423.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-31T18:46:06.024423.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_31T18_46_06.024423", "path": ["**/details_harness|hellaswag|10_2023-07-31T18:46:06.024423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-31T18:46:06.024423.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_31T18_46_06.024423", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T18:46:06.024423.parquet", 
"**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-31T18:46:06.024423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-31T18:46:06.024423.parquet", 
"**/details_harness|hendrycksTest-computer_security|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-31T18:46:06.024423.parquet", 
"**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-31T18:46:06.024423.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-31T18:46:06.024423.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_31T18_46_06.024423", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T18:46:06.024423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T18:46:06.024423.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_31T18_46_06.024423", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-31T18:46:06.024423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-31T18:46:06.024423.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_31T18_46_06.024423", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-31T18:46:06.024423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-31T18:46:06.024423.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_31T18_46_06.024423", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T18:46:06.024423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T18:46:06.024423.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_31T18_46_06.024423", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T18:46:06.024423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T18:46:06.024423.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_31T18_46_06.024423", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-31T18:46:06.024423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-31T18:46:06.024423.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_31T18_46_06.024423", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T18:46:06.024423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T18:46:06.024423.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_31T18_46_06.024423", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T18:46:06.024423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T18:46:06.024423.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_31T18_46_06.024423", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T18:46:06.024423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T18:46:06.024423.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_31T18_46_06.024423", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T18:46:06.024423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T18:46:06.024423.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_31T18_46_06.024423", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-31T18:46:06.024423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-31T18:46:06.024423.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_31T18_46_06.024423", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-31T18:46:06.024423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-31T18:46:06.024423.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_31T18_46_06.024423", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T18:46:06.024423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T18:46:06.024423.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_31T18_46_06.024423", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-31T18:46:06.024423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-31T18:46:06.024423.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_31T18_46_06.024423", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T18:46:06.024423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T18:46:06.024423.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_31T18_46_06.024423", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T18:46:06.024423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T18:46:06.024423.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_31T18_46_06.024423", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T18:46:06.024423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T18:46:06.024423.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_31T18_46_06.024423", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-31T18:46:06.024423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-31T18:46:06.024423.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_31T18_46_06.024423", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T18:46:06.024423.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T18:46:06.024423.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_31T18_46_06.024423", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T18:46:06.024423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T18:46:06.024423.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_31T18_46_06.024423", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T18:46:06.024423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T18:46:06.024423.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_31T18_46_06.024423", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T18:46:06.024423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T18:46:06.024423.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_31T18_46_06.024423", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T18:46:06.024423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T18:46:06.024423.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_31T18_46_06.024423", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T18:46:06.024423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T18:46:06.024423.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_31T18_46_06.024423", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T18:46:06.024423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T18:46:06.024423.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_31T18_46_06.024423", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T18:46:06.024423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T18:46:06.024423.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_31T18_46_06.024423", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T18:46:06.024423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T18:46:06.024423.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_31T18_46_06.024423", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T18:46:06.024423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T18:46:06.024423.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_31T18_46_06.024423", "path": 
["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T18:46:06.024423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T18:46:06.024423.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_31T18_46_06.024423", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T18:46:06.024423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T18:46:06.024423.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_31T18_46_06.024423", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T18:46:06.024423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T18:46:06.024423.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_31T18_46_06.024423", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T18:46:06.024423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T18:46:06.024423.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_31T18_46_06.024423", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-31T18:46:06.024423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-31T18:46:06.024423.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_31T18_46_06.024423", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T18:46:06.024423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T18:46:06.024423.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_31T18_46_06.024423", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-31T18:46:06.024423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-31T18:46:06.024423.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_31T18_46_06.024423", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T18:46:06.024423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T18:46:06.024423.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_31T18_46_06.024423", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T18:46:06.024423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T18:46:06.024423.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_31T18_46_06.024423", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T18:46:06.024423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T18:46:06.024423.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_31T18_46_06.024423", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-31T18:46:06.024423.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-management|5_2023-07-31T18:46:06.024423.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_31T18_46_06.024423", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-31T18:46:06.024423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-31T18:46:06.024423.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_31T18_46_06.024423", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T18:46:06.024423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T18:46:06.024423.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_31T18_46_06.024423", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T18:46:06.024423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T18:46:06.024423.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_31T18_46_06.024423", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T18:46:06.024423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T18:46:06.024423.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_31T18_46_06.024423", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T18:46:06.024423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T18:46:06.024423.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_31T18_46_06.024423", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-31T18:46:06.024423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-31T18:46:06.024423.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_31T18_46_06.024423", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-31T18:46:06.024423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-31T18:46:06.024423.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_31T18_46_06.024423", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-31T18:46:06.024423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-31T18:46:06.024423.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_31T18_46_06.024423", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T18:46:06.024423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T18:46:06.024423.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_31T18_46_06.024423", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-31T18:46:06.024423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-31T18:46:06.024423.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_31T18_46_06.024423", "path": 
["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T18:46:06.024423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T18:46:06.024423.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_31T18_46_06.024423", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T18:46:06.024423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T18:46:06.024423.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_31T18_46_06.024423", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-31T18:46:06.024423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-31T18:46:06.024423.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_31T18_46_06.024423", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-31T18:46:06.024423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-31T18:46:06.024423.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_31T18_46_06.024423", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-31T18:46:06.024423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-31T18:46:06.024423.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_31T18_46_06.024423", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T18:46:06.024423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T18:46:06.024423.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_31T18_46_06.024423", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-31T18:46:06.024423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-31T18:46:06.024423.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_31T18_46_06.024423", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-31T18:46:06.024423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-31T18:46:06.024423.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_31T18_46_06.024423", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-31T18:46:06.024423.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-31T18:46:06.024423.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_31T18_46_06.024423", "path": ["results_2023-07-31T18:46:06.024423.parquet"]}, {"split": "latest", "path": ["results_2023-07-31T18:46:06.024423.parquet"]}]}]}
|
2023-08-27T11:33:55+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of TheBloke/Wizard-Vicuna-30B-Superhot-8K-fp16
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model TheBloke/Wizard-Vicuna-30B-Superhot-8K-fp16 on the Open LLM Leaderboard.
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
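```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__Wizard-Vicuna-30B-Superhot-8K-fp16",
	"harness_truthfulqa_mc_0",
	split="train")
```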
## Latest results
These are the latest results from run 2023-07-31T18:46:06.024423 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
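The aggregate metrics from that run, reproduced from the dataset metadata above (the full per-task breakdown is in the linked results file):

```python
{
    "all": {
        "acc": 0.23519468841762173,
        "acc_stderr": 0.030867946729594396,
        "acc_norm": 0.23665032922383497,
        "acc_norm_stderr": 0.03088234450623421,
        "mc1": 0.22766217870257038,
        "mc1_stderr": 0.01467925503211107,
        "mc2": 0.4747511496520905,
        "mc2_stderr": 0.016743067237896876
    }
}
```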
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of TheBloke/Wizard-Vicuna-30B-Superhot-8K-fp16",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/Wizard-Vicuna-30B-Superhot-8K-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-07-31T18:46:06.024423 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of TheBloke/Wizard-Vicuna-30B-Superhot-8K-fp16",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/Wizard-Vicuna-30B-Superhot-8K-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-07-31T18:46:06.024423 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
32,
31,
180,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of TheBloke/Wizard-Vicuna-30B-Superhot-8K-fp16## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/Wizard-Vicuna-30B-Superhot-8K-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-07-31T18:46:06.024423 (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
a6d5438973c7e03703489a2d167802626018ea96
|
# Dataset Card for Evaluation run of TheBloke/Planner-7B-fp16
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/Planner-7B-fp16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/Planner-7B-fp16](https://huggingface.co/TheBloke/Planner-7B-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__Planner-7B-fp16",
"harness_winogrande_5",
split="train")
```
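To pull just the aggregated numbers instead of the per-example details, a minimal sketch (assuming the "results" configuration described above, whose most recent run is exposed through the "latest" split):

```python
from datasets import load_dataset

# Aggregated metrics for the most recent run; uses the "results" config
# and the "latest" split described in the summary above.
results = load_dataset("open-llm-leaderboard/details_TheBloke__Planner-7B-fp16",
                       "results",
                       split="latest")
```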
## Latest results
These are the [latest results from run 2023-10-21T22:53:17.425716](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Planner-7B-fp16/blob/main/results_2023-10-21T22-53-17.425716.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0010486577181208054,
"em_stderr": 0.0003314581465219126,
"f1": 0.056186031879194784,
"f1_stderr": 0.0012858243614759428,
"acc": 0.3749593848153363,
"acc_stderr": 0.008901319861891403
},
"harness|drop|3": {
"em": 0.0010486577181208054,
"em_stderr": 0.0003314581465219126,
"f1": 0.056186031879194784,
"f1_stderr": 0.0012858243614759428
},
"harness|gsm8k|5": {
"acc": 0.0356330553449583,
"acc_stderr": 0.00510610785374419
},
"harness|winogrande|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.012696531870038616
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_TheBloke__Planner-7B-fp16
|
[
"region:us"
] |
2023-08-18T10:27:45+00:00
|
{"pretty_name": "Evaluation run of TheBloke/Planner-7B-fp16", "dataset_summary": "Dataset automatically created during the evaluation run of model [TheBloke/Planner-7B-fp16](https://huggingface.co/TheBloke/Planner-7B-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__Planner-7B-fp16\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-21T22:53:17.425716](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Planner-7B-fp16/blob/main/results_2023-10-21T22-53-17.425716.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0010486577181208054,\n \"em_stderr\": 0.0003314581465219126,\n \"f1\": 0.056186031879194784,\n \"f1_stderr\": 0.0012858243614759428,\n \"acc\": 0.3749593848153363,\n \"acc_stderr\": 0.008901319861891403\n },\n \"harness|drop|3\": {\n \"em\": 0.0010486577181208054,\n \"em_stderr\": 0.0003314581465219126,\n \"f1\": 0.056186031879194784,\n \"f1_stderr\": 0.0012858243614759428\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0356330553449583,\n \"acc_stderr\": 0.00510610785374419\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.012696531870038616\n }\n}\n```", "repo_url": "https://huggingface.co/TheBloke/Planner-7B-fp16", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T16_47_15.541190", "path": ["**/details_harness|arc:challenge|25_2023-07-19T16:47:15.541190.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T16:47:15.541190.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_21T22_53_17.425716", "path": ["**/details_harness|drop|3_2023-10-21T22-53-17.425716.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-21T22-53-17.425716.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_21T22_53_17.425716", "path": ["**/details_harness|gsm8k|5_2023-10-21T22-53-17.425716.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-21T22-53-17.425716.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T16_47_15.541190", "path": ["**/details_harness|hellaswag|10_2023-07-19T16:47:15.541190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T16:47:15.541190.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": 
[{"split": "2023_07_19T16_47_15.541190", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:47:15.541190.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:47:15.541190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:47:15.541190.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T16:47:15.541190.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:47:15.541190.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T16_47_15.541190", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:47:15.541190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:47:15.541190.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T16_47_15.541190", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:47:15.541190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:47:15.541190.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T16_47_15.541190", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:47:15.541190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:47:15.541190.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T16_47_15.541190", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:47:15.541190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:47:15.541190.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T16_47_15.541190", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:47:15.541190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:47:15.541190.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T16_47_15.541190", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:47:15.541190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:47:15.541190.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T16_47_15.541190", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:47:15.541190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:47:15.541190.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T16_47_15.541190", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:47:15.541190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:47:15.541190.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T16_47_15.541190", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:47:15.541190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:47:15.541190.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T16_47_15.541190", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:47:15.541190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:47:15.541190.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T16_47_15.541190", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:47:15.541190.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:47:15.541190.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T16_47_15.541190", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:47:15.541190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:47:15.541190.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T16_47_15.541190", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:47:15.541190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:47:15.541190.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T16_47_15.541190", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:47:15.541190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:47:15.541190.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T16_47_15.541190", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:47:15.541190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:47:15.541190.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T16_47_15.541190", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:47:15.541190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:47:15.541190.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T16_47_15.541190", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:47:15.541190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:47:15.541190.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T16_47_15.541190", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:47:15.541190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:47:15.541190.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T16_47_15.541190", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:47:15.541190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:47:15.541190.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T16_47_15.541190", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:47:15.541190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:47:15.541190.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T16_47_15.541190", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:47:15.541190.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:47:15.541190.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T16_47_15.541190", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:47:15.541190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:47:15.541190.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T16_47_15.541190", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:47:15.541190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:47:15.541190.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T16_47_15.541190", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:47:15.541190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:47:15.541190.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T16_47_15.541190", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:47:15.541190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:47:15.541190.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T16_47_15.541190", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:47:15.541190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:47:15.541190.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T16_47_15.541190", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:47:15.541190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:47:15.541190.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T16_47_15.541190", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:47:15.541190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:47:15.541190.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T16_47_15.541190", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:47:15.541190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:47:15.541190.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T16_47_15.541190", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:47:15.541190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:47:15.541190.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T16_47_15.541190", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:47:15.541190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:47:15.541190.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T16_47_15.541190", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:47:15.541190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:47:15.541190.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T16_47_15.541190", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:47:15.541190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:47:15.541190.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T16_47_15.541190", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:47:15.541190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:47:15.541190.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T16_47_15.541190", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:47:15.541190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:47:15.541190.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T16_47_15.541190", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:47:15.541190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:47:15.541190.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T16_47_15.541190", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:47:15.541190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:47:15.541190.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T16_47_15.541190", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:47:15.541190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:47:15.541190.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T16_47_15.541190", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T16:47:15.541190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T16:47:15.541190.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T16_47_15.541190", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:47:15.541190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:47:15.541190.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T16_47_15.541190", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:47:15.541190.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:47:15.541190.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T16_47_15.541190", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:47:15.541190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:47:15.541190.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T16_47_15.541190", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:47:15.541190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:47:15.541190.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T16_47_15.541190", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:47:15.541190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:47:15.541190.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T16_47_15.541190", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:47:15.541190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:47:15.541190.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T16_47_15.541190", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:47:15.541190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:47:15.541190.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T16_47_15.541190", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:47:15.541190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:47:15.541190.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T16_47_15.541190", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:47:15.541190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:47:15.541190.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T16_47_15.541190", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:47:15.541190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:47:15.541190.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T16_47_15.541190", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:47:15.541190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:47:15.541190.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T16_47_15.541190", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:47:15.541190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:47:15.541190.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T16_47_15.541190", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:47:15.541190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:47:15.541190.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T16_47_15.541190", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:47:15.541190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:47:15.541190.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T16_47_15.541190", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:47:15.541190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:47:15.541190.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T16_47_15.541190", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:47:15.541190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:47:15.541190.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T16_47_15.541190", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T16:47:15.541190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T16:47:15.541190.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T16_47_15.541190", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:47:15.541190.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:47:15.541190.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T16_47_15.541190", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T16:47:15.541190.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T16:47:15.541190.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_21T22_53_17.425716", "path": ["**/details_harness|winogrande|5_2023-10-21T22-53-17.425716.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-21T22-53-17.425716.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T16_47_15.541190", "path": ["results_2023-07-19T16:47:15.541190.parquet"]}, {"split": "2023_10_21T22_53_17.425716", "path": ["results_2023-10-21T22-53-17.425716.parquet"]}, {"split": "latest", "path": ["results_2023-10-21T22-53-17.425716.parquet"]}]}]}
|
2023-10-21T21:53:30+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of TheBloke/Planner-7B-fp16
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model TheBloke/Planner-7B-fp16 on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
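A minimal sketch of that call is given below. This rendering strips the original snippet, so the repository path is inferred from the leaderboard's `details_{org}__{model}` naming convention, and `harness_winogrande_5` is simply one of the per-task configs listed in this record's metadata; both are assumptions, not a quoted snippet.

```python
from datasets import load_dataset

# Repository path inferred from the leaderboard's naming convention (assumption);
# "harness_winogrande_5" is one of the per-task configs listed in the metadata.
data = load_dataset("open-llm-leaderboard/details_TheBloke__Planner-7B-fp16",
	"harness_winogrande_5",
	split="train")
```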
## Latest results
These are the latest results from run 2023-10-21T22:53:17.425716 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of TheBloke/Planner-7B-fp16",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/Planner-7B-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-21T22:53:17.425716(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of TheBloke/Planner-7B-fp16",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/Planner-7B-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-21T22:53:17.425716(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
22,
31,
170,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of TheBloke/Planner-7B-fp16## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/Planner-7B-fp16 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-21T22:53:17.425716(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
5e69864364fefe35584167564cfbf9538cf524a2
|
# Dataset Card for Evaluation run of TheBloke/gpt4-alpaca-lora-13B-HF
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/gpt4-alpaca-lora-13B-HF
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/gpt4-alpaca-lora-13B-HF](https://huggingface.co/TheBloke/gpt4-alpaca-lora-13B-HF) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__gpt4-alpaca-lora-13B-HF",
"harness_winogrande_5",
split="train")
```
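The repository exposes one config per evaluated task (plus "results"), so it can help to enumerate them before picking one. A small sketch using the config-listing helper from the public `datasets` API; the exact list returned depends on the repository's current state:

```python
from datasets import get_dataset_config_names

# List every per-task config (one per benchmark, plus "results").
configs = get_dataset_config_names("open-llm-leaderboard/details_TheBloke__gpt4-alpaca-lora-13B-HF")
print(configs[:5])
```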
## Latest results
These are the [latest results from run 2023-10-23T00:28:03.157336](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__gpt4-alpaca-lora-13B-HF/blob/main/results_2023-10-23T00-28-03.157336.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0041946308724832215,
"em_stderr": 0.0006618716168266549,
"f1": 0.06315121644295306,
"f1_stderr": 0.0014384546797583987,
"acc": 0.4290722743845191,
"acc_stderr": 0.009899761958935093
},
"harness|drop|3": {
"em": 0.0041946308724832215,
"em_stderr": 0.0006618716168266549,
"f1": 0.06315121644295306,
"f1_stderr": 0.0014384546797583987
},
"harness|gsm8k|5": {
"acc": 0.09097801364670205,
"acc_stderr": 0.007921322844013642
},
"harness|winogrande|5": {
"acc": 0.7671665351223362,
"acc_stderr": 0.011878201073856544
}
}
```
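The aggregated numbers above live in the dedicated "results" config, so they can be loaded directly instead of being re-derived from the per-task details. A minimal sketch, assuming the "latest" split that this card's metadata lists for the "results" config:

```python
from datasets import load_dataset

# "results" holds the aggregated metrics; "latest" points at the most recent run.
results = load_dataset("open-llm-leaderboard/details_TheBloke__gpt4-alpaca-lora-13B-HF",
	"results",
	split="latest")
print(results[0])
```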
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_TheBloke__gpt4-alpaca-lora-13B-HF
|
[
"region:us"
] |
2023-08-18T10:27:53+00:00
|
{"pretty_name": "Evaluation run of TheBloke/gpt4-alpaca-lora-13B-HF", "dataset_summary": "Dataset automatically created during the evaluation run of model [TheBloke/gpt4-alpaca-lora-13B-HF](https://huggingface.co/TheBloke/gpt4-alpaca-lora-13B-HF) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__gpt4-alpaca-lora-13B-HF\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-23T00:28:03.157336](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__gpt4-alpaca-lora-13B-HF/blob/main/results_2023-10-23T00-28-03.157336.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0041946308724832215,\n \"em_stderr\": 0.0006618716168266549,\n \"f1\": 0.06315121644295306,\n \"f1_stderr\": 0.0014384546797583987,\n \"acc\": 0.4290722743845191,\n \"acc_stderr\": 0.009899761958935093\n },\n \"harness|drop|3\": {\n \"em\": 0.0041946308724832215,\n \"em_stderr\": 0.0006618716168266549,\n \"f1\": 0.06315121644295306,\n \"f1_stderr\": 0.0014384546797583987\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.09097801364670205,\n \"acc_stderr\": 0.007921322844013642\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7671665351223362,\n \"acc_stderr\": 0.011878201073856544\n }\n}\n```", "repo_url": "https://huggingface.co/TheBloke/gpt4-alpaca-lora-13B-HF", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T19_32_00.745427", "path": ["**/details_harness|arc:challenge|25_2023-07-19T19:32:00.745427.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T19:32:00.745427.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_23T00_28_03.157336", "path": ["**/details_harness|drop|3_2023-10-23T00-28-03.157336.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-23T00-28-03.157336.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_23T00_28_03.157336", "path": ["**/details_harness|gsm8k|5_2023-10-23T00-28-03.157336.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-23T00-28-03.157336.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T19_32_00.745427", "path": ["**/details_harness|hellaswag|10_2023-07-19T19:32:00.745427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T19:32:00.745427.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T19_32_00.745427", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:32:00.745427.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:32:00.745427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:32:00.745427.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T19:32:00.745427.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:32:00.745427.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T19_32_00.745427", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:32:00.745427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:32:00.745427.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T19_32_00.745427", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:32:00.745427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:32:00.745427.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T19_32_00.745427", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:32:00.745427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:32:00.745427.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T19_32_00.745427", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:32:00.745427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:32:00.745427.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T19_32_00.745427", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:32:00.745427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:32:00.745427.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T19_32_00.745427", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:32:00.745427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:32:00.745427.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T19_32_00.745427", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:32:00.745427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:32:00.745427.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T19_32_00.745427", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:32:00.745427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:32:00.745427.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T19_32_00.745427", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:32:00.745427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:32:00.745427.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T19_32_00.745427", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:32:00.745427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:32:00.745427.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T19_32_00.745427", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:32:00.745427.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:32:00.745427.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T19_32_00.745427", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:32:00.745427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:32:00.745427.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T19_32_00.745427", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:32:00.745427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:32:00.745427.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T19_32_00.745427", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:32:00.745427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:32:00.745427.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T19_32_00.745427", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:32:00.745427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:32:00.745427.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T19_32_00.745427", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:32:00.745427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:32:00.745427.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T19_32_00.745427", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:32:00.745427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:32:00.745427.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T19_32_00.745427", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:32:00.745427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:32:00.745427.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T19_32_00.745427", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:32:00.745427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:32:00.745427.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T19_32_00.745427", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:32:00.745427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:32:00.745427.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T19_32_00.745427", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:32:00.745427.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:32:00.745427.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T19_32_00.745427", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:32:00.745427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:32:00.745427.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T19_32_00.745427", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:32:00.745427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:32:00.745427.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T19_32_00.745427", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:32:00.745427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:32:00.745427.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T19_32_00.745427", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:32:00.745427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:32:00.745427.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T19_32_00.745427", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:32:00.745427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:32:00.745427.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T19_32_00.745427", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:32:00.745427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:32:00.745427.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T19_32_00.745427", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:32:00.745427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:32:00.745427.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T19_32_00.745427", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:32:00.745427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:32:00.745427.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T19_32_00.745427", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:32:00.745427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:32:00.745427.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T19_32_00.745427", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:32:00.745427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:32:00.745427.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T19_32_00.745427", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:32:00.745427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:32:00.745427.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T19_32_00.745427", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:32:00.745427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:32:00.745427.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T19_32_00.745427", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:32:00.745427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:32:00.745427.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T19_32_00.745427", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:32:00.745427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:32:00.745427.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T19_32_00.745427", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:32:00.745427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:32:00.745427.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T19_32_00.745427", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:32:00.745427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:32:00.745427.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T19_32_00.745427", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:32:00.745427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:32:00.745427.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T19_32_00.745427", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T19:32:00.745427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T19:32:00.745427.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T19_32_00.745427", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:32:00.745427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:32:00.745427.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T19_32_00.745427", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:32:00.745427.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:32:00.745427.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T19_32_00.745427", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:32:00.745427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:32:00.745427.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T19_32_00.745427", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:32:00.745427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:32:00.745427.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T19_32_00.745427", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:32:00.745427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:32:00.745427.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T19_32_00.745427", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:32:00.745427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:32:00.745427.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T19_32_00.745427", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:32:00.745427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:32:00.745427.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T19_32_00.745427", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:32:00.745427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:32:00.745427.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T19_32_00.745427", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:32:00.745427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:32:00.745427.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T19_32_00.745427", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:32:00.745427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:32:00.745427.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T19_32_00.745427", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:32:00.745427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:32:00.745427.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T19_32_00.745427", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:32:00.745427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:32:00.745427.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T19_32_00.745427", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:32:00.745427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:32:00.745427.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T19_32_00.745427", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:32:00.745427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:32:00.745427.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T19_32_00.745427", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:32:00.745427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:32:00.745427.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T19_32_00.745427", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:32:00.745427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:32:00.745427.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T19_32_00.745427", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T19:32:00.745427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T19:32:00.745427.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T19_32_00.745427", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:32:00.745427.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:32:00.745427.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T19_32_00.745427", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T19:32:00.745427.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T19:32:00.745427.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_23T00_28_03.157336", "path": ["**/details_harness|winogrande|5_2023-10-23T00-28-03.157336.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-23T00-28-03.157336.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T19_32_00.745427", "path": ["results_2023-07-19T19:32:00.745427.parquet"]}, {"split": "2023_10_23T00_28_03.157336", "path": ["results_2023-10-23T00-28-03.157336.parquet"]}, {"split": "latest", "path": ["results_2023-10-23T00-28-03.157336.parquet"]}]}]}
|
2023-10-22T23:28:16+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of TheBloke/gpt4-alpaca-lora-13B-HF
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model TheBloke/gpt4-alpaca-lora-13B-HF on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration, "results", stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
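A minimal example, assuming the leaderboard's standard `details_<org>__<model>` repository naming (the exact repository id is not spelled out in this copy of the card):

```python
from datasets import load_dataset

# Repository id assumed from the leaderboard's "details_<org>__<model>" convention
data = load_dataset("open-llm-leaderboard/details_TheBloke__gpt4-alpaca-lora-13B-HF",
	"harness_winogrande_5",
	split="train")
```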
## Latest results
These are the latest results from run 2023-10-23T00:28:03.157336 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
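The numbers behind this summary can be pulled from the `results` configuration; a minimal sketch, with the repository id assumed as above and the split names taken from this repository's metadata:

```python
from datasets import load_dataset

# "latest" always points at the most recent run (here 2023-10-23T00:28:03.157336)
results = load_dataset("open-llm-leaderboard/details_TheBloke__gpt4-alpaca-lora-13B-HF",
	"results",
	split="latest")
print(results[0])
```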
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of TheBloke/gpt4-alpaca-lora-13B-HF",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/gpt4-alpaca-lora-13B-HF on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-23T00:28:03.157336(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of TheBloke/gpt4-alpaca-lora-13B-HF",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/gpt4-alpaca-lora-13B-HF on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-23T00:28:03.157336(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
26,
31,
174,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of TheBloke/gpt4-alpaca-lora-13B-HF## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/gpt4-alpaca-lora-13B-HF on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-23T00:28:03.157336(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
6b61e1edea20f7f1e7f6e45e29893c9faf0cb1c7
|
# Dataset Card for Evaluation run of TheBloke/OpenAssistant-SFT-7-Llama-30B-HF
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/OpenAssistant-SFT-7-Llama-30B-HF
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/OpenAssistant-SFT-7-Llama-30B-HF](https://huggingface.co/TheBloke/OpenAssistant-SFT-7-Llama-30B-HF) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration, "results", stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__OpenAssistant-SFT-7-Llama-30B-HF",
"harness_winogrande_5",
split="train")
```
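The aggregated metrics are also exposed directly through the `results` configuration; a minimal sketch, with config and split names taken from this repository's metadata:

```python
from datasets import load_dataset

# "latest" tracks the newest evaluation run of this model
results = load_dataset("open-llm-leaderboard/details_TheBloke__OpenAssistant-SFT-7-Llama-30B-HF",
	"results",
	split="latest")
print(results[0])
```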
## Latest results
These are the [latest results from run 2023-10-18T12:34:46.585647](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__OpenAssistant-SFT-7-Llama-30B-HF/blob/main/results_2023-10-18T12-34-46.585647.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.30463506711409394,
"em_stderr": 0.004713418382367835,
"f1": 0.3681375838926183,
"f1_stderr": 0.0046109589189275765,
"acc": 0.5420309566992765,
"acc_stderr": 0.012061199593502377
},
"harness|drop|3": {
"em": 0.30463506711409394,
"em_stderr": 0.004713418382367835,
"f1": 0.3681375838926183,
"f1_stderr": 0.0046109589189275765
},
"harness|gsm8k|5": {
"acc": 0.2979529946929492,
"acc_stderr": 0.012597932232914508
},
"harness|winogrande|5": {
"acc": 0.7861089187056038,
"acc_stderr": 0.011524466954090248
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_TheBloke__OpenAssistant-SFT-7-Llama-30B-HF
|
[
"region:us"
] |
2023-08-18T10:28:02+00:00
|
{"pretty_name": "Evaluation run of TheBloke/OpenAssistant-SFT-7-Llama-30B-HF", "dataset_summary": "Dataset automatically created during the evaluation run of model [TheBloke/OpenAssistant-SFT-7-Llama-30B-HF](https://huggingface.co/TheBloke/OpenAssistant-SFT-7-Llama-30B-HF) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__OpenAssistant-SFT-7-Llama-30B-HF\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-18T12:34:46.585647](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__OpenAssistant-SFT-7-Llama-30B-HF/blob/main/results_2023-10-18T12-34-46.585647.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.30463506711409394,\n \"em_stderr\": 0.004713418382367835,\n \"f1\": 0.3681375838926183,\n \"f1_stderr\": 0.0046109589189275765,\n \"acc\": 0.5420309566992765,\n \"acc_stderr\": 0.012061199593502377\n },\n \"harness|drop|3\": {\n \"em\": 0.30463506711409394,\n \"em_stderr\": 0.004713418382367835,\n \"f1\": 0.3681375838926183,\n \"f1_stderr\": 0.0046109589189275765\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2979529946929492,\n \"acc_stderr\": 0.012597932232914508\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7861089187056038,\n \"acc_stderr\": 0.011524466954090248\n }\n}\n```", "repo_url": "https://huggingface.co/TheBloke/OpenAssistant-SFT-7-Llama-30B-HF", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T22_44_19.720986", "path": ["**/details_harness|arc:challenge|25_2023-07-19T22:44:19.720986.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T22:44:19.720986.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_18T12_34_46.585647", "path": ["**/details_harness|drop|3_2023-10-18T12-34-46.585647.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-18T12-34-46.585647.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_18T12_34_46.585647", "path": ["**/details_harness|gsm8k|5_2023-10-18T12-34-46.585647.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-18T12-34-46.585647.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T22_44_19.720986", "path": ["**/details_harness|hellaswag|10_2023-07-19T22:44:19.720986.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hellaswag|10_2023-07-19T22:44:19.720986.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T22_44_19.720986", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:44:19.720986.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:44:19.720986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:44:19.720986.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:44:19.720986.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T22:44:19.720986.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:44:19.720986.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_19T22_44_19.720986", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:44:19.720986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:44:19.720986.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T22_44_19.720986", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:44:19.720986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:44:19.720986.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T22_44_19.720986", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:44:19.720986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:44:19.720986.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T22_44_19.720986", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:44:19.720986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:44:19.720986.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T22_44_19.720986", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:44:19.720986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:44:19.720986.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T22_44_19.720986", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:44:19.720986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:44:19.720986.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T22_44_19.720986", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:44:19.720986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:44:19.720986.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T22_44_19.720986", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:44:19.720986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:44:19.720986.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T22_44_19.720986", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:44:19.720986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:44:19.720986.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T22_44_19.720986", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:44:19.720986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:44:19.720986.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_07_19T22_44_19.720986", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:44:19.720986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:44:19.720986.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T22_44_19.720986", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:44:19.720986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:44:19.720986.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T22_44_19.720986", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:44:19.720986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:44:19.720986.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T22_44_19.720986", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:44:19.720986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:44:19.720986.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T22_44_19.720986", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:44:19.720986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:44:19.720986.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T22_44_19.720986", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:44:19.720986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:44:19.720986.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T22_44_19.720986", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:44:19.720986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:44:19.720986.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T22_44_19.720986", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:44:19.720986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:44:19.720986.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T22_44_19.720986", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:44:19.720986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:44:19.720986.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T22_44_19.720986", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:44:19.720986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:44:19.720986.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T22_44_19.720986", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:44:19.720986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:44:19.720986.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T22_44_19.720986", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:44:19.720986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:44:19.720986.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T22_44_19.720986", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:44:19.720986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:44:19.720986.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T22_44_19.720986", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:44:19.720986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:44:19.720986.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T22_44_19.720986", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:44:19.720986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:44:19.720986.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T22_44_19.720986", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:44:19.720986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:44:19.720986.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T22_44_19.720986", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:44:19.720986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:44:19.720986.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T22_44_19.720986", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:44:19.720986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:44:19.720986.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T22_44_19.720986", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:44:19.720986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:44:19.720986.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T22_44_19.720986", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:44:19.720986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:44:19.720986.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T22_44_19.720986", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:44:19.720986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:44:19.720986.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T22_44_19.720986", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:44:19.720986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:44:19.720986.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T22_44_19.720986", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:44:19.720986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:44:19.720986.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T22_44_19.720986", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:44:19.720986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:44:19.720986.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T22_44_19.720986", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:44:19.720986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:44:19.720986.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T22_44_19.720986", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:44:19.720986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:44:19.720986.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T22_44_19.720986", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:44:19.720986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:44:19.720986.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T22_44_19.720986", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:44:19.720986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:44:19.720986.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T22_44_19.720986", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T22:44:19.720986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T22:44:19.720986.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T22_44_19.720986", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:44:19.720986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:44:19.720986.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T22_44_19.720986", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:44:19.720986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:44:19.720986.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T22_44_19.720986", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:44:19.720986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:44:19.720986.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T22_44_19.720986", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:44:19.720986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:44:19.720986.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T22_44_19.720986", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:44:19.720986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:44:19.720986.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T22_44_19.720986", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:44:19.720986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:44:19.720986.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T22_44_19.720986", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:44:19.720986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:44:19.720986.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T22_44_19.720986", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:44:19.720986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:44:19.720986.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T22_44_19.720986", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:44:19.720986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:44:19.720986.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T22_44_19.720986", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:44:19.720986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:44:19.720986.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T22_44_19.720986", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:44:19.720986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:44:19.720986.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T22_44_19.720986", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:44:19.720986.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:44:19.720986.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_19T22_44_19.720986", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:44:19.720986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:44:19.720986.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T22_44_19.720986", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:44:19.720986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:44:19.720986.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T22_44_19.720986", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:44:19.720986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:44:19.720986.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T22_44_19.720986", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:44:19.720986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:44:19.720986.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T22_44_19.720986", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T22:44:19.720986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T22:44:19.720986.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T22_44_19.720986", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:44:19.720986.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:44:19.720986.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T22_44_19.720986", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T22:44:19.720986.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T22:44:19.720986.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_18T12_34_46.585647", "path": ["**/details_harness|winogrande|5_2023-10-18T12-34-46.585647.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-18T12-34-46.585647.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T22_44_19.720986", "path": ["results_2023-07-19T22:44:19.720986.parquet"]}, {"split": "2023_10_18T12_34_46.585647", "path": ["results_2023-10-18T12-34-46.585647.parquet"]}, {"split": "latest", "path": ["results_2023-10-18T12-34-46.585647.parquet"]}]}]}
|
2023-10-18T11:35:00+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of TheBloke/OpenAssistant-SFT-7-Llama-30B-HF
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model TheBloke/OpenAssistant-SFT-7-Llama-30B-HF on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration, "results", stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
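For example (the repository id is given in the full card for this run):

```python
from datasets import load_dataset

data = load_dataset("open-llm-leaderboard/details_TheBloke__OpenAssistant-SFT-7-Llama-30B-HF",
	"harness_winogrande_5",
	split="train")
```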
## Latest results
These are the latest results from run 2023-10-18T12:34:46.585647 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of TheBloke/OpenAssistant-SFT-7-Llama-30B-HF",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/OpenAssistant-SFT-7-Llama-30B-HF on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-18T12:34:46.585647(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of TheBloke/OpenAssistant-SFT-7-Llama-30B-HF",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/OpenAssistant-SFT-7-Llama-30B-HF on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-18T12:34:46.585647(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
29,
31,
177,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of TheBloke/OpenAssistant-SFT-7-Llama-30B-HF## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model TheBloke/OpenAssistant-SFT-7-Llama-30B-HF on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-18T12:34:46.585647(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |