Each row of this dump has the following fields (length ranges as reported by the dataset viewer):

| Field | Type | Length range |
|---|---|---|
| sha | string | 40–40 |
| text | string | 1–13.4M |
| id | string | 2–117 |
| tags | list | 1–7.91k |
| created_at | string | 25–25 |
| metadata | string | 2–875k |
| last_modified | string | 25–25 |
| arxiv | list | 0–25 |
| languages | list | 0–7.91k |
| tags_str | string | 17–159k |
| text_str | string | 1–447k |
| text_lists | list | 0–352 |
| processed_texts | list | 1–353 |
| tokens_length | list | 1–353 |
| input_texts | list | 1–40 |

The rows below are rendered field by field, with `|` marking field boundaries.
fc8e1ba08ac3e94bd7503f84033ab0d816efc1e8
|
# Dataset Card for Evaluation run of TehVenom/PPO_Shygmalion-6b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TehVenom/PPO_Shygmalion-6b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [TehVenom/PPO_Shygmalion-6b](https://huggingface.co/TehVenom/PPO_Shygmalion-6b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TehVenom__PPO_Shygmalion-6b",
"harness_winogrande_5",
split="train")
```
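Each run's split is named after its timestamp, so you can also pin a specific run instead of relying on the "train"/"latest" alias. A minimal sketch, using the split name listed in this repository's configuration metadata:
```python
from datasets import load_dataset

REPO = "open-llm-leaderboard/details_TehVenom__PPO_Shygmalion-6b"

# Pin a specific run by its timestamped split name
# (split names are listed in the repo's configuration metadata).
run = load_dataset(REPO, "harness_winogrande_5",
                   split="2023_10_18T15_36_21.377959")

# Or use the alias that always tracks the newest run.
latest = load_dataset(REPO, "harness_winogrande_5", split="latest")
```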
## Latest results
These are the [latest results from run 2023-10-18T15:36:21.377959](https://huggingface.co/datasets/open-llm-leaderboard/details_TehVenom__PPO_Shygmalion-6b/blob/main/results_2023-10-18T15-36-21.377959.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0010486577181208054,
"em_stderr": 0.0003314581465219229,
"f1": 0.051307676174496844,
"f1_stderr": 0.001242463870785362,
"acc": 0.3358539181760356,
"acc_stderr": 0.008527692652879759
},
"harness|drop|3": {
"em": 0.0010486577181208054,
"em_stderr": 0.0003314581465219229,
"f1": 0.051307676174496844,
"f1_stderr": 0.001242463870785362
},
"harness|gsm8k|5": {
"acc": 0.01819560272934041,
"acc_stderr": 0.003681611894073874
},
"harness|winogrande|5": {
"acc": 0.6535122336227308,
"acc_stderr": 0.013373773411685644
}
}
```
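If you prefer to work with the raw file, the JSON linked above can be fetched directly from the dataset repository. A minimal sketch using `huggingface_hub` (the file name is taken from the link above; the `"all"` block holds the aggregated metrics):
```python
import json
from huggingface_hub import hf_hub_download

# Download the results file shown above from the dataset repo.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_TehVenom__PPO_Shygmalion-6b",
    filename="results_2023-10-18T15-36-21.377959.json",
    repo_type="dataset",
)

with open(path) as f:
    results = json.load(f)

# Aggregated metrics (em, f1, acc and their standard errors).
print(results["all"])
```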
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_TehVenom__PPO_Shygmalion-6b
|
[
"region:us"
] |
2023-08-17T23:11:31+00:00
|
{"pretty_name": "Evaluation run of TehVenom/PPO_Shygmalion-6b", "dataset_summary": "Dataset automatically created during the evaluation run of model [TehVenom/PPO_Shygmalion-6b](https://huggingface.co/TehVenom/PPO_Shygmalion-6b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TehVenom__PPO_Shygmalion-6b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-18T15:36:21.377959](https://huggingface.co/datasets/open-llm-leaderboard/details_TehVenom__PPO_Shygmalion-6b/blob/main/results_2023-10-18T15-36-21.377959.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0010486577181208054,\n \"em_stderr\": 0.0003314581465219229,\n \"f1\": 0.051307676174496844,\n \"f1_stderr\": 0.001242463870785362,\n \"acc\": 0.3358539181760356,\n \"acc_stderr\": 0.008527692652879759\n },\n \"harness|drop|3\": {\n \"em\": 0.0010486577181208054,\n \"em_stderr\": 0.0003314581465219229,\n \"f1\": 0.051307676174496844,\n \"f1_stderr\": 0.001242463870785362\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.01819560272934041,\n \"acc_stderr\": 0.003681611894073874\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6535122336227308,\n \"acc_stderr\": 0.013373773411685644\n }\n}\n```", "repo_url": "https://huggingface.co/TehVenom/PPO_Shygmalion-6b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T16_01_34.013898", "path": ["**/details_harness|arc:challenge|25_2023-07-19T16:01:34.013898.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T16:01:34.013898.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_18T15_36_21.377959", "path": ["**/details_harness|drop|3_2023-10-18T15-36-21.377959.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-18T15-36-21.377959.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_18T15_36_21.377959", "path": ["**/details_harness|gsm8k|5_2023-10-18T15-36-21.377959.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-18T15-36-21.377959.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T16_01_34.013898", "path": ["**/details_harness|hellaswag|10_2023-07-19T16:01:34.013898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T16:01:34.013898.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", 
"data_files": [{"split": "2023_07_19T16_01_34.013898", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:01:34.013898.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:01:34.013898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:01:34.013898.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T16:01:34.013898.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:01:34.013898.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T16_01_34.013898", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:01:34.013898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:01:34.013898.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T16_01_34.013898", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:01:34.013898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:01:34.013898.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T16_01_34.013898", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:01:34.013898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:01:34.013898.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T16_01_34.013898", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:01:34.013898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:01:34.013898.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T16_01_34.013898", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:01:34.013898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:01:34.013898.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T16_01_34.013898", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:01:34.013898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:01:34.013898.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T16_01_34.013898", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:01:34.013898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:01:34.013898.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T16_01_34.013898", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:01:34.013898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:01:34.013898.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T16_01_34.013898", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:01:34.013898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:01:34.013898.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T16_01_34.013898", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:01:34.013898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:01:34.013898.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T16_01_34.013898", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:01:34.013898.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:01:34.013898.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T16_01_34.013898", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:01:34.013898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:01:34.013898.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T16_01_34.013898", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:01:34.013898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:01:34.013898.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T16_01_34.013898", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:01:34.013898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:01:34.013898.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T16_01_34.013898", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:01:34.013898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:01:34.013898.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T16_01_34.013898", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:01:34.013898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:01:34.013898.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T16_01_34.013898", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:01:34.013898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:01:34.013898.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T16_01_34.013898", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:01:34.013898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:01:34.013898.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T16_01_34.013898", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:01:34.013898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:01:34.013898.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T16_01_34.013898", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:01:34.013898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:01:34.013898.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T16_01_34.013898", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:01:34.013898.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:01:34.013898.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T16_01_34.013898", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:01:34.013898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:01:34.013898.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T16_01_34.013898", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:01:34.013898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:01:34.013898.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T16_01_34.013898", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:01:34.013898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:01:34.013898.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T16_01_34.013898", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:01:34.013898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:01:34.013898.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T16_01_34.013898", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:01:34.013898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:01:34.013898.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T16_01_34.013898", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:01:34.013898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:01:34.013898.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T16_01_34.013898", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:01:34.013898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:01:34.013898.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T16_01_34.013898", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:01:34.013898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:01:34.013898.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T16_01_34.013898", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:01:34.013898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:01:34.013898.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T16_01_34.013898", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:01:34.013898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:01:34.013898.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T16_01_34.013898", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:01:34.013898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:01:34.013898.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T16_01_34.013898", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:01:34.013898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:01:34.013898.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T16_01_34.013898", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:01:34.013898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:01:34.013898.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T16_01_34.013898", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:01:34.013898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:01:34.013898.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T16_01_34.013898", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:01:34.013898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:01:34.013898.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T16_01_34.013898", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:01:34.013898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:01:34.013898.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T16_01_34.013898", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:01:34.013898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:01:34.013898.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T16_01_34.013898", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T16:01:34.013898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T16:01:34.013898.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T16_01_34.013898", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:01:34.013898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:01:34.013898.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T16_01_34.013898", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:01:34.013898.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:01:34.013898.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T16_01_34.013898", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:01:34.013898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:01:34.013898.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T16_01_34.013898", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:01:34.013898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:01:34.013898.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T16_01_34.013898", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:01:34.013898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:01:34.013898.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T16_01_34.013898", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:01:34.013898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:01:34.013898.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T16_01_34.013898", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:01:34.013898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:01:34.013898.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T16_01_34.013898", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:01:34.013898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:01:34.013898.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T16_01_34.013898", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:01:34.013898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:01:34.013898.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T16_01_34.013898", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:01:34.013898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:01:34.013898.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T16_01_34.013898", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:01:34.013898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:01:34.013898.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T16_01_34.013898", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:01:34.013898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:01:34.013898.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T16_01_34.013898", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:01:34.013898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:01:34.013898.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T16_01_34.013898", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:01:34.013898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:01:34.013898.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T16_01_34.013898", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:01:34.013898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:01:34.013898.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T16_01_34.013898", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:01:34.013898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:01:34.013898.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T16_01_34.013898", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T16:01:34.013898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T16:01:34.013898.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T16_01_34.013898", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:01:34.013898.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:01:34.013898.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T16_01_34.013898", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T16:01:34.013898.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T16:01:34.013898.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_18T15_36_21.377959", "path": ["**/details_harness|winogrande|5_2023-10-18T15-36-21.377959.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-18T15-36-21.377959.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T16_01_34.013898", "path": ["results_2023-07-19T16:01:34.013898.parquet"]}, {"split": "2023_10_18T15_36_21.377959", "path": ["results_2023-10-18T15-36-21.377959.parquet"]}, {"split": "latest", "path": ["results_2023-10-18T15-36-21.377959.parquet"]}]}]}
|
2023-10-18T14:37:15+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of TehVenom/PPO_Shygmalion-6b
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model TehVenom/PPO_Shygmalion-6b on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can, for instance, do the following:
## Latest results
These are the latest results from run 2023-10-18T15:36:21.377959 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of TehVenom/PPO_Shygmalion-6b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TehVenom/PPO_Shygmalion-6b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-18T15:36:21.377959(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of TehVenom/PPO_Shygmalion-6b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TehVenom/PPO_Shygmalion-6b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-18T15:36:21.377959(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
22,
31,
170,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of TehVenom/PPO_Shygmalion-6b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model TehVenom/PPO_Shygmalion-6b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-18T15:36:21.377959(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
a8687c7bce5dbb3889001e1fa3445cfec75c15aa
|
# Dataset Card for Evaluation run of TehVenom/Metharme-13b-Merged
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TehVenom/Metharme-13b-Merged
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [TehVenom/Metharme-13b-Merged](https://huggingface.co/TehVenom/Metharme-13b-Merged) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TehVenom__Metharme-13b-Merged",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-22T04:29:35.620999](https://huggingface.co/datasets/open-llm-leaderboard/details_TehVenom__Metharme-13b-Merged/blob/main/results_2023-10-22T04-29-35.620999.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.14828020134228187,
"em_stderr": 0.003639398453670487,
"f1": 0.20397126677852254,
"f1_stderr": 0.0037224342383047814,
"acc": 0.4275715320915309,
"acc_stderr": 0.009817420554305734
},
"harness|drop|3": {
"em": 0.14828020134228187,
"em_stderr": 0.003639398453670487,
"f1": 0.20397126677852254,
"f1_stderr": 0.0037224342383047814
},
"harness|gsm8k|5": {
"acc": 0.08718726307808947,
"acc_stderr": 0.0077706914167835345
},
"harness|winogrande|5": {
"acc": 0.7679558011049724,
"acc_stderr": 0.011864149691827933
}
}
```
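These aggregated numbers are also exposed through the "results" configuration mentioned above. A minimal sketch; the exact column layout of the results table is not documented here, so inspect the loaded dataset before relying on specific fields:
```python
from datasets import load_dataset

# "latest" always points at the newest results run.
results = load_dataset(
    "open-llm-leaderboard/details_TehVenom__Metharme-13b-Merged",
    "results",
    split="latest",
)

# Column layout is an assumption to verify; print the first row to inspect.
print(results.column_names)
print(results[0])
```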
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_TehVenom__Metharme-13b-Merged
|
[
"region:us"
] |
2023-08-17T23:11:40+00:00
|
{"pretty_name": "Evaluation run of TehVenom/Metharme-13b-Merged", "dataset_summary": "Dataset automatically created during the evaluation run of model [TehVenom/Metharme-13b-Merged](https://huggingface.co/TehVenom/Metharme-13b-Merged) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TehVenom__Metharme-13b-Merged\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-22T04:29:35.620999](https://huggingface.co/datasets/open-llm-leaderboard/details_TehVenom__Metharme-13b-Merged/blob/main/results_2023-10-22T04-29-35.620999.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.14828020134228187,\n \"em_stderr\": 0.003639398453670487,\n \"f1\": 0.20397126677852254,\n \"f1_stderr\": 0.0037224342383047814,\n \"acc\": 0.4275715320915309,\n \"acc_stderr\": 0.009817420554305734\n },\n \"harness|drop|3\": {\n \"em\": 0.14828020134228187,\n \"em_stderr\": 0.003639398453670487,\n \"f1\": 0.20397126677852254,\n \"f1_stderr\": 0.0037224342383047814\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.08718726307808947,\n \"acc_stderr\": 0.0077706914167835345\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7679558011049724,\n \"acc_stderr\": 0.011864149691827933\n }\n}\n```", "repo_url": "https://huggingface.co/TehVenom/Metharme-13b-Merged", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T18_38_16.849457", "path": ["**/details_harness|arc:challenge|25_2023-07-19T18:38:16.849457.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T18:38:16.849457.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_22T04_29_35.620999", "path": ["**/details_harness|drop|3_2023-10-22T04-29-35.620999.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-22T04-29-35.620999.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_22T04_29_35.620999", "path": ["**/details_harness|gsm8k|5_2023-10-22T04-29-35.620999.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-22T04-29-35.620999.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T18_38_16.849457", "path": ["**/details_harness|hellaswag|10_2023-07-19T18:38:16.849457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T18:38:16.849457.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", 
"data_files": [{"split": "2023_07_19T18_38_16.849457", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:38:16.849457.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:38:16.849457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:38:16.849457.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T18:38:16.849457.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:38:16.849457.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T18_38_16.849457", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:38:16.849457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:38:16.849457.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T18_38_16.849457", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:38:16.849457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:38:16.849457.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T18_38_16.849457", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:38:16.849457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:38:16.849457.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T18_38_16.849457", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:38:16.849457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:38:16.849457.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T18_38_16.849457", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:38:16.849457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:38:16.849457.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T18_38_16.849457", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:38:16.849457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:38:16.849457.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T18_38_16.849457", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:38:16.849457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:38:16.849457.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T18_38_16.849457", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:38:16.849457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:38:16.849457.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T18_38_16.849457", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:38:16.849457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:38:16.849457.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T18_38_16.849457", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:38:16.849457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:38:16.849457.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T18_38_16.849457", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:38:16.849457.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:38:16.849457.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T18_38_16.849457", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:38:16.849457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:38:16.849457.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T18_38_16.849457", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:38:16.849457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:38:16.849457.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T18_38_16.849457", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:38:16.849457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:38:16.849457.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T18_38_16.849457", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:38:16.849457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:38:16.849457.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T18_38_16.849457", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:38:16.849457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:38:16.849457.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T18_38_16.849457", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:38:16.849457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:38:16.849457.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T18_38_16.849457", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:38:16.849457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:38:16.849457.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T18_38_16.849457", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:38:16.849457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:38:16.849457.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T18_38_16.849457", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:38:16.849457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:38:16.849457.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T18_38_16.849457", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:38:16.849457.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:38:16.849457.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T18_38_16.849457", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:38:16.849457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:38:16.849457.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T18_38_16.849457", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:38:16.849457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:38:16.849457.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T18_38_16.849457", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:38:16.849457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:38:16.849457.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T18_38_16.849457", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:38:16.849457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:38:16.849457.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T18_38_16.849457", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:38:16.849457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:38:16.849457.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T18_38_16.849457", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:38:16.849457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:38:16.849457.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T18_38_16.849457", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:38:16.849457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:38:16.849457.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T18_38_16.849457", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:38:16.849457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:38:16.849457.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T18_38_16.849457", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:38:16.849457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:38:16.849457.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T18_38_16.849457", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:38:16.849457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:38:16.849457.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T18_38_16.849457", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:38:16.849457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:38:16.849457.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T18_38_16.849457", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:38:16.849457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:38:16.849457.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T18_38_16.849457", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:38:16.849457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:38:16.849457.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T18_38_16.849457", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:38:16.849457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:38:16.849457.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T18_38_16.849457", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:38:16.849457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:38:16.849457.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T18_38_16.849457", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:38:16.849457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:38:16.849457.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T18_38_16.849457", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:38:16.849457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:38:16.849457.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T18_38_16.849457", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T18:38:16.849457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T18:38:16.849457.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T18_38_16.849457", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:38:16.849457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:38:16.849457.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T18_38_16.849457", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:38:16.849457.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:38:16.849457.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T18_38_16.849457", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:38:16.849457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:38:16.849457.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T18_38_16.849457", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:38:16.849457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:38:16.849457.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T18_38_16.849457", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:38:16.849457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:38:16.849457.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T18_38_16.849457", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:38:16.849457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:38:16.849457.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T18_38_16.849457", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:38:16.849457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:38:16.849457.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T18_38_16.849457", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:38:16.849457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:38:16.849457.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T18_38_16.849457", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:38:16.849457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:38:16.849457.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T18_38_16.849457", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:38:16.849457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:38:16.849457.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T18_38_16.849457", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:38:16.849457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:38:16.849457.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T18_38_16.849457", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:38:16.849457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:38:16.849457.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T18_38_16.849457", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:38:16.849457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:38:16.849457.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T18_38_16.849457", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:38:16.849457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:38:16.849457.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T18_38_16.849457", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:38:16.849457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:38:16.849457.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T18_38_16.849457", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:38:16.849457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:38:16.849457.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T18_38_16.849457", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T18:38:16.849457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T18:38:16.849457.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T18_38_16.849457", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:38:16.849457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:38:16.849457.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T18_38_16.849457", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T18:38:16.849457.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T18:38:16.849457.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_22T04_29_35.620999", "path": ["**/details_harness|winogrande|5_2023-10-22T04-29-35.620999.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-22T04-29-35.620999.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T18_38_16.849457", "path": ["results_2023-07-19T18:38:16.849457.parquet"]}, {"split": "2023_10_22T04_29_35.620999", "path": ["results_2023-10-22T04-29-35.620999.parquet"]}, {"split": "latest", "path": ["results_2023-10-22T04-29-35.620999.parquet"]}]}]}
|
2023-10-22T03:29:48+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of TehVenom/Metharme-13b-Merged
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model TehVenom/Metharme-13b-Merged on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
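A minimal sketch, assuming this repository follows the leaderboard's usual naming convention (open-llm-leaderboard/details_<org>__<model>) and exposes the same "harness_winogrande_5" configuration as the sibling cards in this collection:

```python
from datasets import load_dataset

# Assumed repository id, derived from the naming convention noted above
data = load_dataset("open-llm-leaderboard/details_TehVenom__Metharme-13b-Merged",
                    "harness_winogrande_5",
                    split="train")
```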
## Latest results
These are the latest results from run 2023-10-22T04:29:35.620999 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
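Because the "results" configuration aggregates every run and its "latest" split tracks the newest one, these numbers can also be pulled programmatically; a sketch under the same repository-id assumption as above:

```python
from datasets import load_dataset

# "latest" always points at the most recent evaluation run
results = load_dataset("open-llm-leaderboard/details_TehVenom__Metharme-13b-Merged",
                       "results",
                       split="latest")
print(results[0])  # a row holding the aggregated metrics
```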
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of TehVenom/Metharme-13b-Merged",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TehVenom/Metharme-13b-Merged on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-22T04:29:35.620999(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of TehVenom/Metharme-13b-Merged",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TehVenom/Metharme-13b-Merged on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-22T04:29:35.620999(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
21,
31,
169,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of TehVenom/Metharme-13b-Merged## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model TehVenom/Metharme-13b-Merged on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-22T04:29:35.620999(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
229054dfca0f5171d4af324b7607e1a8d71bceae
|
# Dataset Card for Evaluation run of TehVenom/PPO_Pygway-V8p4_Dev-6b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TehVenom/PPO_Pygway-V8p4_Dev-6b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [TehVenom/PPO_Pygway-V8p4_Dev-6b](https://huggingface.co/TehVenom/PPO_Pygway-V8p4_Dev-6b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TehVenom__PPO_Pygway-V8p4_Dev-6b",
"harness_winogrande_5",
split="train")
```
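Since each run is stored as its own timestamped split alongside "latest", the available runs can be listed without downloading the data; a small sketch reusing the configuration from the example above:

```python
from datasets import get_dataset_split_names

# Lists the timestamped run splits plus "latest" for one configuration
print(get_dataset_split_names(
    "open-llm-leaderboard/details_TehVenom__PPO_Pygway-V8p4_Dev-6b",
    "harness_winogrande_5"))
```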
## Latest results
These are the [latest results from run 2023-10-17T01:37:21.951842](https://huggingface.co/datasets/open-llm-leaderboard/details_TehVenom__PPO_Pygway-V8p4_Dev-6b/blob/main/results_2023-10-17T01-37-21.951842.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.001363255033557047,
"em_stderr": 0.00037786091964606367,
"f1": 0.05445260067114113,
"f1_stderr": 0.0012869911657351386,
"acc": 0.3352881479056926,
"acc_stderr": 0.008941893321769558
},
"harness|drop|3": {
"em": 0.001363255033557047,
"em_stderr": 0.00037786091964606367,
"f1": 0.05445260067114113,
"f1_stderr": 0.0012869911657351386
},
"harness|gsm8k|5": {
"acc": 0.026535253980288095,
"acc_stderr": 0.004427045987265163
},
"harness|winogrande|5": {
"acc": 0.6440410418310971,
"acc_stderr": 0.013456740656273952
}
}
```
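Each accuracy above is reported together with a standard error, so a rough uncertainty band is straightforward to derive; for example, a normal-approximation 95% interval for the winogrande score (a sketch using only the numbers reported above):

```python
acc, stderr = 0.6440410418310971, 0.013456740656273952  # harness|winogrande|5
low, high = acc - 1.96 * stderr, acc + 1.96 * stderr
print(f"winogrande acc = {acc:.3f}, 95% CI ~ [{low:.3f}, {high:.3f}]")
```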
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_TehVenom__PPO_Pygway-V8p4_Dev-6b
|
[
"region:us"
] |
2023-08-17T23:11:49+00:00
|
{"pretty_name": "Evaluation run of TehVenom/PPO_Pygway-V8p4_Dev-6b", "dataset_summary": "Dataset automatically created during the evaluation run of model [TehVenom/PPO_Pygway-V8p4_Dev-6b](https://huggingface.co/TehVenom/PPO_Pygway-V8p4_Dev-6b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TehVenom__PPO_Pygway-V8p4_Dev-6b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-17T01:37:21.951842](https://huggingface.co/datasets/open-llm-leaderboard/details_TehVenom__PPO_Pygway-V8p4_Dev-6b/blob/main/results_2023-10-17T01-37-21.951842.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.001363255033557047,\n \"em_stderr\": 0.00037786091964606367,\n \"f1\": 0.05445260067114113,\n \"f1_stderr\": 0.0012869911657351386,\n \"acc\": 0.3352881479056926,\n \"acc_stderr\": 0.008941893321769558\n },\n \"harness|drop|3\": {\n \"em\": 0.001363255033557047,\n \"em_stderr\": 0.00037786091964606367,\n \"f1\": 0.05445260067114113,\n \"f1_stderr\": 0.0012869911657351386\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.026535253980288095,\n \"acc_stderr\": 0.004427045987265163\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6440410418310971,\n \"acc_stderr\": 0.013456740656273952\n }\n}\n```", "repo_url": "https://huggingface.co/TehVenom/PPO_Pygway-V8p4_Dev-6b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T15_50_24.593524", "path": ["**/details_harness|arc:challenge|25_2023-07-19T15:50:24.593524.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T15:50:24.593524.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_17T01_37_21.951842", "path": ["**/details_harness|drop|3_2023-10-17T01-37-21.951842.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-17T01-37-21.951842.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_17T01_37_21.951842", "path": ["**/details_harness|gsm8k|5_2023-10-17T01-37-21.951842.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-17T01-37-21.951842.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T15_50_24.593524", "path": ["**/details_harness|hellaswag|10_2023-07-19T15:50:24.593524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T15:50:24.593524.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T15_50_24.593524", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:50:24.593524.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:50:24.593524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:50:24.593524.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T15:50:24.593524.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:50:24.593524.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T15_50_24.593524", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:50:24.593524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:50:24.593524.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T15_50_24.593524", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:50:24.593524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:50:24.593524.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T15_50_24.593524", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:50:24.593524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:50:24.593524.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T15_50_24.593524", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:50:24.593524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:50:24.593524.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T15_50_24.593524", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:50:24.593524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:50:24.593524.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T15_50_24.593524", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:50:24.593524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:50:24.593524.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T15_50_24.593524", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:50:24.593524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:50:24.593524.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T15_50_24.593524", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:50:24.593524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:50:24.593524.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T15_50_24.593524", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:50:24.593524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:50:24.593524.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T15_50_24.593524", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:50:24.593524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:50:24.593524.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T15_50_24.593524", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:50:24.593524.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:50:24.593524.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T15_50_24.593524", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:50:24.593524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:50:24.593524.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T15_50_24.593524", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:50:24.593524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:50:24.593524.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T15_50_24.593524", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:50:24.593524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:50:24.593524.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T15_50_24.593524", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:50:24.593524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:50:24.593524.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T15_50_24.593524", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:50:24.593524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:50:24.593524.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T15_50_24.593524", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:50:24.593524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:50:24.593524.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T15_50_24.593524", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:50:24.593524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:50:24.593524.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T15_50_24.593524", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:50:24.593524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:50:24.593524.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T15_50_24.593524", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:50:24.593524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:50:24.593524.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T15_50_24.593524", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:50:24.593524.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:50:24.593524.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T15_50_24.593524", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:50:24.593524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:50:24.593524.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T15_50_24.593524", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:50:24.593524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:50:24.593524.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T15_50_24.593524", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:50:24.593524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:50:24.593524.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T15_50_24.593524", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:50:24.593524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:50:24.593524.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T15_50_24.593524", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:50:24.593524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:50:24.593524.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T15_50_24.593524", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:50:24.593524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:50:24.593524.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T15_50_24.593524", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:50:24.593524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:50:24.593524.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T15_50_24.593524", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:50:24.593524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:50:24.593524.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T15_50_24.593524", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:50:24.593524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:50:24.593524.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T15_50_24.593524", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:50:24.593524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:50:24.593524.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T15_50_24.593524", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:50:24.593524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:50:24.593524.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T15_50_24.593524", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:50:24.593524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:50:24.593524.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T15_50_24.593524", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:50:24.593524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:50:24.593524.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T15_50_24.593524", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:50:24.593524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:50:24.593524.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T15_50_24.593524", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:50:24.593524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:50:24.593524.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T15_50_24.593524", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:50:24.593524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:50:24.593524.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T15_50_24.593524", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:50:24.593524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:50:24.593524.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T15_50_24.593524", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T15:50:24.593524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T15:50:24.593524.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T15_50_24.593524", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:50:24.593524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:50:24.593524.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T15_50_24.593524", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:50:24.593524.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:50:24.593524.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T15_50_24.593524", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:50:24.593524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:50:24.593524.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T15_50_24.593524", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:50:24.593524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:50:24.593524.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T15_50_24.593524", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:50:24.593524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:50:24.593524.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T15_50_24.593524", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:50:24.593524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:50:24.593524.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T15_50_24.593524", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:50:24.593524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:50:24.593524.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T15_50_24.593524", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:50:24.593524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:50:24.593524.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T15_50_24.593524", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:50:24.593524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:50:24.593524.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T15_50_24.593524", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:50:24.593524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:50:24.593524.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T15_50_24.593524", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:50:24.593524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:50:24.593524.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T15_50_24.593524", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:50:24.593524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:50:24.593524.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T15_50_24.593524", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:50:24.593524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:50:24.593524.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T15_50_24.593524", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:50:24.593524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:50:24.593524.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T15_50_24.593524", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:50:24.593524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:50:24.593524.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T15_50_24.593524", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:50:24.593524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:50:24.593524.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T15_50_24.593524", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T15:50:24.593524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T15:50:24.593524.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T15_50_24.593524", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:50:24.593524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:50:24.593524.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T15_50_24.593524", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T15:50:24.593524.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T15:50:24.593524.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_17T01_37_21.951842", "path": ["**/details_harness|winogrande|5_2023-10-17T01-37-21.951842.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-17T01-37-21.951842.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T15_50_24.593524", "path": ["results_2023-07-19T15:50:24.593524.parquet"]}, {"split": "2023_10_17T01_37_21.951842", "path": ["results_2023-10-17T01-37-21.951842.parquet"]}, {"split": "latest", "path": ["results_2023-10-17T01-37-21.951842.parquet"]}]}]}
|
2023-10-17T00:37:34+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of TehVenom/PPO_Pygway-V8p4_Dev-6b
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model TehVenom/PPO_Pygway-V8p4_Dev-6b on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
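A minimal sketch, assuming the repository id follows the collection's `details_<org>__<model>` naming convention (the id below is inferred, not taken from this card, and should be verified):

```python
from datasets import load_dataset

# Repo id assumed from the open-llm-leaderboard/details_<org>__<model> pattern;
# "harness_winogrande_5" is one of the configurations listed in this card's metadata.
data = load_dataset("open-llm-leaderboard/details_TehVenom__PPO_Pygway-V8p4_Dev-6b",
	"harness_winogrande_5",
	split="train")
```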
## Latest results
These are the latest results from run 2023-10-17T01:37:21.951842 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of TehVenom/PPO_Pygway-V8p4_Dev-6b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TehVenom/PPO_Pygway-V8p4_Dev-6b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-17T01:37:21.951842(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of TehVenom/PPO_Pygway-V8p4_Dev-6b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TehVenom/PPO_Pygway-V8p4_Dev-6b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-17T01:37:21.951842(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
28,
31,
176,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of TehVenom/PPO_Pygway-V8p4_Dev-6b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model TehVenom/PPO_Pygway-V8p4_Dev-6b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-17T01:37:21.951842(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
edd1bc5764f8fc1a0154df71ef8ddc1efb929319
|
# Dataset Card for Evaluation run of TehVenom/Dolly_Malion-6b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TehVenom/Dolly_Malion-6b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [TehVenom/Dolly_Malion-6b](https://huggingface.co/TehVenom/Dolly_Malion-6b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TehVenom__Dolly_Malion-6b",
"harness_winogrande_5",
split="train")
```
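The aggregated metrics are also exposed through the "results" configuration listed in this card's metadata; a minimal sketch for pulling the latest aggregate row:

```python
from datasets import load_dataset

# The "results" configuration collects every run; its "latest" split points
# at the most recent evaluation (here 2023-10-15T18:43:13.792642).
results = load_dataset("open-llm-leaderboard/details_TehVenom__Dolly_Malion-6b",
	"results",
	split="latest")
print(results[0])  # one row of aggregated metrics
```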
## Latest results
These are the [latest results from run 2023-10-15T18:43:13.792642](https://huggingface.co/datasets/open-llm-leaderboard/details_TehVenom__Dolly_Malion-6b/blob/main/results_2023-10-15T18-43-13.792642.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0009437919463087249,
"em_stderr": 0.0003144653119413333,
"f1": 0.049715813758389356,
"f1_stderr": 0.0012233418239437058,
"acc": 0.33586947611049245,
"acc_stderr": 0.008486041909966314
},
"harness|drop|3": {
"em": 0.0009437919463087249,
"em_stderr": 0.0003144653119413333,
"f1": 0.049715813758389356,
"f1_stderr": 0.0012233418239437058
},
"harness|gsm8k|5": {
"acc": 0.017437452615617893,
"acc_stderr": 0.0036054868679982525
},
"harness|winogrande|5": {
"acc": 0.654301499605367,
"acc_stderr": 0.013366596951934376
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_TehVenom__Dolly_Malion-6b
|
[
"region:us"
] |
2023-08-17T23:11:58+00:00
|
{"pretty_name": "Evaluation run of TehVenom/Dolly_Malion-6b", "dataset_summary": "Dataset automatically created during the evaluation run of model [TehVenom/Dolly_Malion-6b](https://huggingface.co/TehVenom/Dolly_Malion-6b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TehVenom__Dolly_Malion-6b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-15T18:43:13.792642](https://huggingface.co/datasets/open-llm-leaderboard/details_TehVenom__Dolly_Malion-6b/blob/main/results_2023-10-15T18-43-13.792642.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0009437919463087249,\n \"em_stderr\": 0.0003144653119413333,\n \"f1\": 0.049715813758389356,\n \"f1_stderr\": 0.0012233418239437058,\n \"acc\": 0.33586947611049245,\n \"acc_stderr\": 0.008486041909966314\n },\n \"harness|drop|3\": {\n \"em\": 0.0009437919463087249,\n \"em_stderr\": 0.0003144653119413333,\n \"f1\": 0.049715813758389356,\n \"f1_stderr\": 0.0012233418239437058\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.017437452615617893,\n \"acc_stderr\": 0.0036054868679982525\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.654301499605367,\n \"acc_stderr\": 0.013366596951934376\n }\n}\n```", "repo_url": "https://huggingface.co/TehVenom/Dolly_Malion-6b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T16_03_49.515297", "path": ["**/details_harness|arc:challenge|25_2023-07-19T16:03:49.515297.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T16:03:49.515297.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_15T18_43_13.792642", "path": ["**/details_harness|drop|3_2023-10-15T18-43-13.792642.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-15T18-43-13.792642.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_15T18_43_13.792642", "path": ["**/details_harness|gsm8k|5_2023-10-15T18-43-13.792642.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-15T18-43-13.792642.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T16_03_49.515297", "path": ["**/details_harness|hellaswag|10_2023-07-19T16:03:49.515297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T16:03:49.515297.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": 
[{"split": "2023_07_19T16_03_49.515297", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:03:49.515297.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:03:49.515297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:03:49.515297.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T16:03:49.515297.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:03:49.515297.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T16_03_49.515297", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:03:49.515297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:03:49.515297.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T16_03_49.515297", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:03:49.515297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:03:49.515297.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T16_03_49.515297", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:03:49.515297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:03:49.515297.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T16_03_49.515297", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:03:49.515297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:03:49.515297.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T16_03_49.515297", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:03:49.515297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:03:49.515297.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T16_03_49.515297", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:03:49.515297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:03:49.515297.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T16_03_49.515297", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:03:49.515297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:03:49.515297.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T16_03_49.515297", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:03:49.515297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:03:49.515297.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T16_03_49.515297", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:03:49.515297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:03:49.515297.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T16_03_49.515297", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:03:49.515297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:03:49.515297.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T16_03_49.515297", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:03:49.515297.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:03:49.515297.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T16_03_49.515297", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:03:49.515297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:03:49.515297.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T16_03_49.515297", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:03:49.515297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:03:49.515297.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T16_03_49.515297", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:03:49.515297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:03:49.515297.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T16_03_49.515297", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:03:49.515297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:03:49.515297.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T16_03_49.515297", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:03:49.515297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:03:49.515297.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T16_03_49.515297", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:03:49.515297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:03:49.515297.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T16_03_49.515297", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:03:49.515297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:03:49.515297.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T16_03_49.515297", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:03:49.515297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:03:49.515297.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T16_03_49.515297", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:03:49.515297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:03:49.515297.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T16_03_49.515297", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:03:49.515297.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:03:49.515297.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T16_03_49.515297", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:03:49.515297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:03:49.515297.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T16_03_49.515297", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:03:49.515297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:03:49.515297.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T16_03_49.515297", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:03:49.515297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:03:49.515297.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T16_03_49.515297", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:03:49.515297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:03:49.515297.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T16_03_49.515297", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:03:49.515297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:03:49.515297.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T16_03_49.515297", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:03:49.515297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:03:49.515297.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T16_03_49.515297", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:03:49.515297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:03:49.515297.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T16_03_49.515297", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:03:49.515297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:03:49.515297.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T16_03_49.515297", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:03:49.515297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:03:49.515297.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T16_03_49.515297", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:03:49.515297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:03:49.515297.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T16_03_49.515297", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:03:49.515297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:03:49.515297.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T16_03_49.515297", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:03:49.515297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:03:49.515297.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T16_03_49.515297", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:03:49.515297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:03:49.515297.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T16_03_49.515297", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:03:49.515297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:03:49.515297.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T16_03_49.515297", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:03:49.515297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:03:49.515297.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T16_03_49.515297", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:03:49.515297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:03:49.515297.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T16_03_49.515297", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:03:49.515297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:03:49.515297.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T16_03_49.515297", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T16:03:49.515297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T16:03:49.515297.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T16_03_49.515297", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:03:49.515297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:03:49.515297.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T16_03_49.515297", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:03:49.515297.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:03:49.515297.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T16_03_49.515297", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:03:49.515297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:03:49.515297.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T16_03_49.515297", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:03:49.515297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:03:49.515297.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T16_03_49.515297", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:03:49.515297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:03:49.515297.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T16_03_49.515297", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:03:49.515297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:03:49.515297.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T16_03_49.515297", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:03:49.515297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:03:49.515297.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T16_03_49.515297", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:03:49.515297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:03:49.515297.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T16_03_49.515297", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:03:49.515297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:03:49.515297.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T16_03_49.515297", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:03:49.515297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:03:49.515297.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T16_03_49.515297", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:03:49.515297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:03:49.515297.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T16_03_49.515297", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:03:49.515297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:03:49.515297.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T16_03_49.515297", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:03:49.515297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:03:49.515297.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T16_03_49.515297", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:03:49.515297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:03:49.515297.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T16_03_49.515297", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:03:49.515297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:03:49.515297.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T16_03_49.515297", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:03:49.515297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:03:49.515297.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T16_03_49.515297", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T16:03:49.515297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T16:03:49.515297.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T16_03_49.515297", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:03:49.515297.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:03:49.515297.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T16_03_49.515297", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T16:03:49.515297.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T16:03:49.515297.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_15T18_43_13.792642", "path": ["**/details_harness|winogrande|5_2023-10-15T18-43-13.792642.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-15T18-43-13.792642.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T16_03_49.515297", "path": ["results_2023-07-19T16:03:49.515297.parquet"]}, {"split": "2023_10_15T18_43_13.792642", "path": ["results_2023-10-15T18-43-13.792642.parquet"]}, {"split": "latest", "path": ["results_2023-10-15T18-43-13.792642.parquet"]}]}]}
|
2023-10-15T17:43:25+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of TehVenom/Dolly_Malion-6b
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model TehVenom/Dolly_Malion-6b on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
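Mirroring the full card for this dataset earlier in this document:

```python
from datasets import load_dataset

# Same call as in the full Dolly_Malion-6b card above.
data = load_dataset("open-llm-leaderboard/details_TehVenom__Dolly_Malion-6b",
	"harness_winogrande_5",
	split="train")
```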
## Latest results
These are the latest results from run 2023-10-15T18:43:13.792642 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of TehVenom/Dolly_Malion-6b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TehVenom/Dolly_Malion-6b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-15T18:43:13.792642(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of TehVenom/Dolly_Malion-6b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TehVenom/Dolly_Malion-6b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-15T18:43:13.792642(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
20,
31,
168,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of TehVenom/Dolly_Malion-6b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model TehVenom/Dolly_Malion-6b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-15T18:43:13.792642(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
e1e32933ea6b1aa104fd196687ed7e6c26a8694f
|
# Dataset Card for Evaluation run of TehVenom/ChanMalion
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TehVenom/ChanMalion
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [TehVenom/ChanMalion](https://huggingface.co/TehVenom/ChanMalion) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TehVenom__ChanMalion",
"harness_winogrande_5",
split="train")
```
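The same call pattern also works for the aggregated metrics: the sketch below loads the "results" configuration at its "latest" split (both are listed in this card's configuration metadata).
```python
from datasets import load_dataset

# Load the aggregated metrics for the most recent evaluation run
results = load_dataset("open-llm-leaderboard/details_TehVenom__ChanMalion",
	"results",
	split="latest")
```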
## Latest results
These are the [latest results from run 2023-10-15T12:15:40.310969](https://huggingface.co/datasets/open-llm-leaderboard/details_TehVenom__ChanMalion/blob/main/results_2023-10-15T12-15-40.310969.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0008389261744966443,
"em_stderr": 0.00029649629898012493,
"f1": 0.04850041946308733,
"f1_stderr": 0.0011972806992898283,
"acc": 0.3350957680623131,
"acc_stderr": 0.008450684650204038
},
"harness|drop|3": {
"em": 0.0008389261744966443,
"em_stderr": 0.00029649629898012493,
"f1": 0.04850041946308733,
"f1_stderr": 0.0011972806992898283
},
"harness|gsm8k|5": {
"acc": 0.016679302501895376,
"acc_stderr": 0.0035275958887224334
},
"harness|winogrande|5": {
"acc": 0.6535122336227308,
"acc_stderr": 0.013373773411685644
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_TehVenom__ChanMalion
|
[
"region:us"
] |
2023-08-17T23:12:07+00:00
|
{"pretty_name": "Evaluation run of TehVenom/ChanMalion", "dataset_summary": "Dataset automatically created during the evaluation run of model [TehVenom/ChanMalion](https://huggingface.co/TehVenom/ChanMalion) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TehVenom__ChanMalion\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-15T12:15:40.310969](https://huggingface.co/datasets/open-llm-leaderboard/details_TehVenom__ChanMalion/blob/main/results_2023-10-15T12-15-40.310969.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0008389261744966443,\n \"em_stderr\": 0.00029649629898012493,\n \"f1\": 0.04850041946308733,\n \"f1_stderr\": 0.0011972806992898283,\n \"acc\": 0.3350957680623131,\n \"acc_stderr\": 0.008450684650204038\n },\n \"harness|drop|3\": {\n \"em\": 0.0008389261744966443,\n \"em_stderr\": 0.00029649629898012493,\n \"f1\": 0.04850041946308733,\n \"f1_stderr\": 0.0011972806992898283\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.016679302501895376,\n \"acc_stderr\": 0.0035275958887224334\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6535122336227308,\n \"acc_stderr\": 0.013373773411685644\n }\n}\n```", "repo_url": "https://huggingface.co/TehVenom/ChanMalion", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T19_18_28.111835", "path": ["**/details_harness|arc:challenge|25_2023-07-19T19:18:28.111835.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T19:18:28.111835.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_15T12_15_40.310969", "path": ["**/details_harness|drop|3_2023-10-15T12-15-40.310969.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-15T12-15-40.310969.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_15T12_15_40.310969", "path": ["**/details_harness|gsm8k|5_2023-10-15T12-15-40.310969.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-15T12-15-40.310969.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T19_18_28.111835", "path": ["**/details_harness|hellaswag|10_2023-07-19T19:18:28.111835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T19:18:28.111835.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_07_19T19_18_28.111835", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:18:28.111835.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:18:28.111835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:18:28.111835.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T19:18:28.111835.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:18:28.111835.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T19_18_28.111835", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:18:28.111835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:18:28.111835.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T19_18_28.111835", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:18:28.111835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:18:28.111835.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T19_18_28.111835", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:18:28.111835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:18:28.111835.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T19_18_28.111835", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:18:28.111835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:18:28.111835.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T19_18_28.111835", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:18:28.111835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:18:28.111835.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T19_18_28.111835", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:18:28.111835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:18:28.111835.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T19_18_28.111835", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:18:28.111835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:18:28.111835.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T19_18_28.111835", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:18:28.111835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:18:28.111835.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T19_18_28.111835", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:18:28.111835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:18:28.111835.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T19_18_28.111835", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:18:28.111835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:18:28.111835.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T19_18_28.111835", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:18:28.111835.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:18:28.111835.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T19_18_28.111835", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:18:28.111835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:18:28.111835.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T19_18_28.111835", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:18:28.111835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:18:28.111835.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T19_18_28.111835", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:18:28.111835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:18:28.111835.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T19_18_28.111835", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:18:28.111835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:18:28.111835.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T19_18_28.111835", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:18:28.111835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:18:28.111835.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T19_18_28.111835", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:18:28.111835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:18:28.111835.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T19_18_28.111835", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:18:28.111835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:18:28.111835.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T19_18_28.111835", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:18:28.111835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:18:28.111835.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T19_18_28.111835", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:18:28.111835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:18:28.111835.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T19_18_28.111835", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:18:28.111835.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:18:28.111835.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T19_18_28.111835", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:18:28.111835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:18:28.111835.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T19_18_28.111835", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:18:28.111835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:18:28.111835.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T19_18_28.111835", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:18:28.111835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:18:28.111835.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T19_18_28.111835", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:18:28.111835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:18:28.111835.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T19_18_28.111835", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:18:28.111835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:18:28.111835.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T19_18_28.111835", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:18:28.111835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:18:28.111835.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T19_18_28.111835", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:18:28.111835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:18:28.111835.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T19_18_28.111835", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:18:28.111835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:18:28.111835.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T19_18_28.111835", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:18:28.111835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:18:28.111835.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T19_18_28.111835", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:18:28.111835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:18:28.111835.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T19_18_28.111835", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:18:28.111835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:18:28.111835.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T19_18_28.111835", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:18:28.111835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:18:28.111835.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T19_18_28.111835", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:18:28.111835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:18:28.111835.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T19_18_28.111835", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:18:28.111835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:18:28.111835.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T19_18_28.111835", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:18:28.111835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:18:28.111835.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T19_18_28.111835", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:18:28.111835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:18:28.111835.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T19_18_28.111835", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:18:28.111835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:18:28.111835.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T19_18_28.111835", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T19:18:28.111835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T19:18:28.111835.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T19_18_28.111835", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:18:28.111835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:18:28.111835.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T19_18_28.111835", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:18:28.111835.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:18:28.111835.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T19_18_28.111835", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:18:28.111835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:18:28.111835.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T19_18_28.111835", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:18:28.111835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:18:28.111835.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T19_18_28.111835", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:18:28.111835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:18:28.111835.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T19_18_28.111835", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:18:28.111835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:18:28.111835.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T19_18_28.111835", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:18:28.111835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:18:28.111835.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T19_18_28.111835", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:18:28.111835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:18:28.111835.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T19_18_28.111835", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:18:28.111835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:18:28.111835.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T19_18_28.111835", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:18:28.111835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:18:28.111835.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T19_18_28.111835", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:18:28.111835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:18:28.111835.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T19_18_28.111835", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:18:28.111835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:18:28.111835.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T19_18_28.111835", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:18:28.111835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:18:28.111835.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T19_18_28.111835", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:18:28.111835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:18:28.111835.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T19_18_28.111835", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:18:28.111835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:18:28.111835.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T19_18_28.111835", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:18:28.111835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:18:28.111835.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T19_18_28.111835", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T19:18:28.111835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T19:18:28.111835.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T19_18_28.111835", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:18:28.111835.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:18:28.111835.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T19_18_28.111835", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T19:18:28.111835.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T19:18:28.111835.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_15T12_15_40.310969", "path": ["**/details_harness|winogrande|5_2023-10-15T12-15-40.310969.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-15T12-15-40.310969.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T19_18_28.111835", "path": ["results_2023-07-19T19:18:28.111835.parquet"]}, {"split": "2023_10_15T12_15_40.310969", "path": ["results_2023-10-15T12-15-40.310969.parquet"]}, {"split": "latest", "path": ["results_2023-10-15T12-15-40.310969.parquet"]}]}]}
|
2023-10-15T11:15:51+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of TehVenom/ChanMalion
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model TehVenom/ChanMalion on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
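For example (a minimal sketch; the repository id is the one listed for this run, and the config name comes from this card's configuration metadata):
```python
from datasets import load_dataset

# "harness_winogrande_5" is one of the per-task configurations of this run
data = load_dataset("open-llm-leaderboard/details_TehVenom__ChanMalion",
	"harness_winogrande_5",
	split="train")
```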
## Latest results
These are the latest results from run 2023-10-15T12:15:40.310969 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of TehVenom/ChanMalion",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TehVenom/ChanMalion on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-15T12:15:40.310969(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of TehVenom/ChanMalion",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TehVenom/ChanMalion on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-15T12:15:40.310969(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
16,
31,
164,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of TehVenom/ChanMalion## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model TehVenom/ChanMalion on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-15T12:15:40.310969(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
a0eb73bacfcc9f9f12727348d6242556133196d3
|
# Dataset Card for Evaluation run of TehVenom/Pygmalion_AlpacaLora-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TehVenom/Pygmalion_AlpacaLora-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [TehVenom/Pygmalion_AlpacaLora-7b](https://huggingface.co/TehVenom/Pygmalion_AlpacaLora-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TehVenom__Pygmalion_AlpacaLora-7b",
"harness_winogrande_5",
split="train")
```
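To retrieve the aggregated metrics instead of per-sample details, the "results" configuration described above can be loaded the same way; a minimal sketch, assuming the same "latest" split naming as the other runs:
```python
from datasets import load_dataset

# Aggregated metrics for the most recent run of this model
results = load_dataset("open-llm-leaderboard/details_TehVenom__Pygmalion_AlpacaLora-7b",
	"results",
	split="latest")
```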
## Latest results
These are the [latest results from run 2023-10-18T13:29:21.938536](https://huggingface.co/datasets/open-llm-leaderboard/details_TehVenom__Pygmalion_AlpacaLora-7b/blob/main/results_2023-10-18T13-29-21.938536.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.25817953020134227,
"em_stderr": 0.004481774083922211,
"f1": 0.3091002516778529,
"f1_stderr": 0.0044655983468890985,
"acc": 0.367154387965818,
"acc_stderr": 0.007802106213381273
},
"harness|drop|3": {
"em": 0.25817953020134227,
"em_stderr": 0.004481774083922211,
"f1": 0.3091002516778529,
"f1_stderr": 0.0044655983468890985
},
"harness|gsm8k|5": {
"acc": 0.012130401819560273,
"acc_stderr": 0.003015294242890945
},
"harness|winogrande|5": {
"acc": 0.7221783741120757,
"acc_stderr": 0.012588918183871601
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_TehVenom__Pygmalion_AlpacaLora-7b
|
[
"region:us"
] |
2023-08-17T23:12:16+00:00
|
{"pretty_name": "Evaluation run of TehVenom/Pygmalion_AlpacaLora-7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [TehVenom/Pygmalion_AlpacaLora-7b](https://huggingface.co/TehVenom/Pygmalion_AlpacaLora-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TehVenom__Pygmalion_AlpacaLora-7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-18T13:29:21.938536](https://huggingface.co/datasets/open-llm-leaderboard/details_TehVenom__Pygmalion_AlpacaLora-7b/blob/main/results_2023-10-18T13-29-21.938536.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.25817953020134227,\n \"em_stderr\": 0.004481774083922211,\n \"f1\": 0.3091002516778529,\n \"f1_stderr\": 0.0044655983468890985,\n \"acc\": 0.367154387965818,\n \"acc_stderr\": 0.007802106213381273\n },\n \"harness|drop|3\": {\n \"em\": 0.25817953020134227,\n \"em_stderr\": 0.004481774083922211,\n \"f1\": 0.3091002516778529,\n \"f1_stderr\": 0.0044655983468890985\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.012130401819560273,\n \"acc_stderr\": 0.003015294242890945\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7221783741120757,\n \"acc_stderr\": 0.012588918183871601\n }\n}\n```", "repo_url": "https://huggingface.co/TehVenom/Pygmalion_AlpacaLora-7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T16_17_50.932996", "path": ["**/details_harness|arc:challenge|25_2023-07-19T16:17:50.932996.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T16:17:50.932996.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_18T13_29_21.938536", "path": ["**/details_harness|drop|3_2023-10-18T13-29-21.938536.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-18T13-29-21.938536.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_18T13_29_21.938536", "path": ["**/details_harness|gsm8k|5_2023-10-18T13-29-21.938536.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-18T13-29-21.938536.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T16_17_50.932996", "path": ["**/details_harness|hellaswag|10_2023-07-19T16:17:50.932996.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T16:17:50.932996.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T16_17_50.932996", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:17:50.932996.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:17:50.932996.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:17:50.932996.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T16:17:50.932996.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:17:50.932996.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T16_17_50.932996", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:17:50.932996.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:17:50.932996.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T16_17_50.932996", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:17:50.932996.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:17:50.932996.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T16_17_50.932996", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:17:50.932996.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:17:50.932996.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T16_17_50.932996", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:17:50.932996.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:17:50.932996.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T16_17_50.932996", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:17:50.932996.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:17:50.932996.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T16_17_50.932996", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:17:50.932996.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:17:50.932996.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T16_17_50.932996", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:17:50.932996.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:17:50.932996.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T16_17_50.932996", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:17:50.932996.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:17:50.932996.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T16_17_50.932996", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:17:50.932996.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:17:50.932996.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T16_17_50.932996", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:17:50.932996.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:17:50.932996.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T16_17_50.932996", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:17:50.932996.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:17:50.932996.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T16_17_50.932996", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:17:50.932996.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:17:50.932996.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T16_17_50.932996", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:17:50.932996.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:17:50.932996.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T16_17_50.932996", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:17:50.932996.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:17:50.932996.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T16_17_50.932996", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:17:50.932996.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:17:50.932996.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T16_17_50.932996", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:17:50.932996.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:17:50.932996.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T16_17_50.932996", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:17:50.932996.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:17:50.932996.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T16_17_50.932996", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:17:50.932996.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:17:50.932996.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T16_17_50.932996", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:17:50.932996.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:17:50.932996.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T16_17_50.932996", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:17:50.932996.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:17:50.932996.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T16_17_50.932996", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:17:50.932996.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:17:50.932996.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T16_17_50.932996", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:17:50.932996.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:17:50.932996.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T16_17_50.932996", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:17:50.932996.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:17:50.932996.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T16_17_50.932996", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:17:50.932996.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:17:50.932996.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T16_17_50.932996", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:17:50.932996.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:17:50.932996.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T16_17_50.932996", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:17:50.932996.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:17:50.932996.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T16_17_50.932996", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:17:50.932996.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:17:50.932996.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T16_17_50.932996", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:17:50.932996.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:17:50.932996.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T16_17_50.932996", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:17:50.932996.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:17:50.932996.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T16_17_50.932996", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:17:50.932996.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:17:50.932996.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T16_17_50.932996", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:17:50.932996.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:17:50.932996.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T16_17_50.932996", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:17:50.932996.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:17:50.932996.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T16_17_50.932996", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:17:50.932996.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:17:50.932996.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T16_17_50.932996", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:17:50.932996.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:17:50.932996.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T16_17_50.932996", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:17:50.932996.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:17:50.932996.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T16_17_50.932996", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:17:50.932996.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:17:50.932996.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T16_17_50.932996", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:17:50.932996.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:17:50.932996.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T16_17_50.932996", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:17:50.932996.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:17:50.932996.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T16_17_50.932996", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T16:17:50.932996.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T16:17:50.932996.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T16_17_50.932996", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:17:50.932996.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:17:50.932996.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T16_17_50.932996", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:17:50.932996.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:17:50.932996.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T16_17_50.932996", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:17:50.932996.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:17:50.932996.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T16_17_50.932996", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:17:50.932996.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:17:50.932996.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T16_17_50.932996", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:17:50.932996.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:17:50.932996.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T16_17_50.932996", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:17:50.932996.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:17:50.932996.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T16_17_50.932996", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:17:50.932996.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:17:50.932996.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T16_17_50.932996", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:17:50.932996.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:17:50.932996.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T16_17_50.932996", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:17:50.932996.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:17:50.932996.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T16_17_50.932996", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:17:50.932996.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:17:50.932996.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T16_17_50.932996", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:17:50.932996.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:17:50.932996.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T16_17_50.932996", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:17:50.932996.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:17:50.932996.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T16_17_50.932996", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:17:50.932996.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:17:50.932996.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T16_17_50.932996", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:17:50.932996.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:17:50.932996.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T16_17_50.932996", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:17:50.932996.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:17:50.932996.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T16_17_50.932996", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:17:50.932996.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:17:50.932996.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T16_17_50.932996", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T16:17:50.932996.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T16:17:50.932996.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T16_17_50.932996", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:17:50.932996.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:17:50.932996.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T16_17_50.932996", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T16:17:50.932996.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T16:17:50.932996.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_18T13_29_21.938536", "path": ["**/details_harness|winogrande|5_2023-10-18T13-29-21.938536.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-18T13-29-21.938536.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T16_17_50.932996", "path": ["results_2023-07-19T16:17:50.932996.parquet"]}, {"split": "2023_10_18T13_29_21.938536", "path": ["results_2023-10-18T13-29-21.938536.parquet"]}, {"split": "latest", "path": ["results_2023-10-18T13-29-21.938536.parquet"]}]}]}
|
2023-10-18T12:29:34+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of TehVenom/Pygmalion_AlpacaLora-7b
## Dataset Description
- Homepage:
- Repository: https://huggingface.co/TehVenom/Pygmalion_AlpacaLora-7b
- Paper:
- Leaderboard: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- Point of Contact: [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model TehVenom/Pygmalion_AlpacaLora-7b on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
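```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TehVenom__Pygmalion_AlpacaLora-7b",
    "harness_winogrande_5",
    split="train")
```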
## Latest results
These are the latest results from run 2023-10-18T13:29:21.938536 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
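```python
{
    "all": {
        "em": 0.25817953020134227,
        "em_stderr": 0.004481774083922211,
        "f1": 0.3091002516778529,
        "f1_stderr": 0.0044655983468890985,
        "acc": 0.367154387965818,
        "acc_stderr": 0.007802106213381273
    },
    "harness|drop|3": {
        "em": 0.25817953020134227,
        "em_stderr": 0.004481774083922211,
        "f1": 0.3091002516778529,
        "f1_stderr": 0.0044655983468890985
    },
    "harness|gsm8k|5": {
        "acc": 0.012130401819560273,
        "acc_stderr": 0.003015294242890945
    },
    "harness|winogrande|5": {
        "acc": 0.7221783741120757,
        "acc_stderr": 0.012588918183871601
    }
}
```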
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
75baab330fd25e67be35ea483d631fa0c5b7e763
|
# Dataset Card for Evaluation run of MetaIX/GPT4-X-Alpasta-30b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/MetaIX/GPT4-X-Alpasta-30b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [MetaIX/GPT4-X-Alpasta-30b](https://huggingface.co/MetaIX/GPT4-X-Alpasta-30b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_MetaIX__GPT4-X-Alpasta-30b",
"harness_winogrande_5",
split="train")
```
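Beyond the hard-coded snippet above, the available per-task configurations and run splits can also be discovered programmatically. A minimal sketch, assuming network access to the Hugging Face Hub; the `results` config and `latest` split names are taken from this card's metadata:

```python
from datasets import get_dataset_config_names, load_dataset

# Enumerate the 64 per-task configurations of this details dataset.
configs = get_dataset_config_names("open-llm-leaderboard/details_MetaIX__GPT4-X-Alpasta-30b")

# Each config exposes one timestamped split per evaluation run plus a
# "latest" alias; the aggregated scores live in the "results" config.
results = load_dataset(
    "open-llm-leaderboard/details_MetaIX__GPT4-X-Alpasta-30b",
    "results",
    split="latest",
)
```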
## Latest results
These are the [latest results from run 2023-09-17T08:07:45.972235](https://huggingface.co/datasets/open-llm-leaderboard/details_MetaIX__GPT4-X-Alpasta-30b/blob/main/results_2023-09-17T08-07-45.972235.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.31312919463087246,
"em_stderr": 0.00474940232599683,
"f1": 0.4037961409395989,
"f1_stderr": 0.0045737911370298204,
"acc": 0.5434694672544375,
"acc_stderr": 0.012140181814727365
},
"harness|drop|3": {
"em": 0.31312919463087246,
"em_stderr": 0.00474940232599683,
"f1": 0.4037961409395989,
"f1_stderr": 0.0045737911370298204
},
"harness|gsm8k|5": {
"acc": 0.30477634571645185,
"acc_stderr": 0.012679297549515406
},
"harness|winogrande|5": {
"acc": 0.7821625887924231,
"acc_stderr": 0.011601066079939324
}
}
```
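As a sanity check on the aggregation, the top-level "all" block appears to be an unweighted mean over the contributing tasks: "em" and "f1" mirror the single DROP task, while "acc" averages GSM8K and Winogrande:

```python
# Unweighted mean of the two accuracy tasks reproduces the aggregate "acc":
(0.30477634571645185 + 0.7821625887924231) / 2  # = 0.5434694672544375
```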
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_MetaIX__GPT4-X-Alpasta-30b
|
[
"region:us"
] |
2023-08-17T23:12:25+00:00
|
{"pretty_name": "Evaluation run of MetaIX/GPT4-X-Alpasta-30b", "dataset_summary": "Dataset automatically created during the evaluation run of model [MetaIX/GPT4-X-Alpasta-30b](https://huggingface.co/MetaIX/GPT4-X-Alpasta-30b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_MetaIX__GPT4-X-Alpasta-30b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-17T08:07:45.972235](https://huggingface.co/datasets/open-llm-leaderboard/details_MetaIX__GPT4-X-Alpasta-30b/blob/main/results_2023-09-17T08-07-45.972235.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.31312919463087246,\n \"em_stderr\": 0.00474940232599683,\n \"f1\": 0.4037961409395989,\n \"f1_stderr\": 0.0045737911370298204,\n \"acc\": 0.5434694672544375,\n \"acc_stderr\": 0.012140181814727365\n },\n \"harness|drop|3\": {\n \"em\": 0.31312919463087246,\n \"em_stderr\": 0.00474940232599683,\n \"f1\": 0.4037961409395989,\n \"f1_stderr\": 0.0045737911370298204\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.30477634571645185,\n \"acc_stderr\": 0.012679297549515406\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7821625887924231,\n \"acc_stderr\": 0.011601066079939324\n }\n}\n```", "repo_url": "https://huggingface.co/MetaIX/GPT4-X-Alpasta-30b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T22_29_11.642048", "path": ["**/details_harness|arc:challenge|25_2023-07-19T22:29:11.642048.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T22:29:11.642048.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_17T08_07_45.972235", "path": ["**/details_harness|drop|3_2023-09-17T08-07-45.972235.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-17T08-07-45.972235.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_17T08_07_45.972235", "path": ["**/details_harness|gsm8k|5_2023-09-17T08-07-45.972235.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-17T08-07-45.972235.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T22_29_11.642048", "path": ["**/details_harness|hellaswag|10_2023-07-19T22:29:11.642048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T22:29:11.642048.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_07_19T22_29_11.642048", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:29:11.642048.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:29:11.642048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:29:11.642048.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T22:29:11.642048.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:29:11.642048.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T22_29_11.642048", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:29:11.642048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:29:11.642048.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T22_29_11.642048", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:29:11.642048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:29:11.642048.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T22_29_11.642048", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:29:11.642048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:29:11.642048.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T22_29_11.642048", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:29:11.642048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:29:11.642048.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T22_29_11.642048", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:29:11.642048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:29:11.642048.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T22_29_11.642048", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:29:11.642048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:29:11.642048.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T22_29_11.642048", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:29:11.642048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:29:11.642048.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T22_29_11.642048", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:29:11.642048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:29:11.642048.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T22_29_11.642048", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:29:11.642048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:29:11.642048.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T22_29_11.642048", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:29:11.642048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:29:11.642048.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T22_29_11.642048", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:29:11.642048.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:29:11.642048.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T22_29_11.642048", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:29:11.642048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:29:11.642048.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T22_29_11.642048", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:29:11.642048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:29:11.642048.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T22_29_11.642048", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:29:11.642048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:29:11.642048.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T22_29_11.642048", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:29:11.642048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:29:11.642048.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T22_29_11.642048", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:29:11.642048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:29:11.642048.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T22_29_11.642048", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:29:11.642048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:29:11.642048.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T22_29_11.642048", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:29:11.642048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:29:11.642048.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T22_29_11.642048", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:29:11.642048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:29:11.642048.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T22_29_11.642048", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:29:11.642048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:29:11.642048.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T22_29_11.642048", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:29:11.642048.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:29:11.642048.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T22_29_11.642048", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:29:11.642048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:29:11.642048.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T22_29_11.642048", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:29:11.642048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:29:11.642048.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T22_29_11.642048", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:29:11.642048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:29:11.642048.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T22_29_11.642048", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:29:11.642048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:29:11.642048.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T22_29_11.642048", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:29:11.642048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:29:11.642048.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T22_29_11.642048", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:29:11.642048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:29:11.642048.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T22_29_11.642048", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:29:11.642048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:29:11.642048.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T22_29_11.642048", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:29:11.642048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:29:11.642048.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T22_29_11.642048", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:29:11.642048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:29:11.642048.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T22_29_11.642048", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:29:11.642048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:29:11.642048.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T22_29_11.642048", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:29:11.642048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:29:11.642048.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T22_29_11.642048", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:29:11.642048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:29:11.642048.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T22_29_11.642048", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:29:11.642048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:29:11.642048.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T22_29_11.642048", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:29:11.642048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:29:11.642048.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T22_29_11.642048", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:29:11.642048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:29:11.642048.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T22_29_11.642048", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:29:11.642048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:29:11.642048.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T22_29_11.642048", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:29:11.642048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:29:11.642048.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T22_29_11.642048", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T22:29:11.642048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T22:29:11.642048.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T22_29_11.642048", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:29:11.642048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:29:11.642048.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T22_29_11.642048", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:29:11.642048.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:29:11.642048.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T22_29_11.642048", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:29:11.642048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:29:11.642048.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T22_29_11.642048", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:29:11.642048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:29:11.642048.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T22_29_11.642048", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:29:11.642048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:29:11.642048.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T22_29_11.642048", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:29:11.642048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:29:11.642048.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T22_29_11.642048", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:29:11.642048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:29:11.642048.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T22_29_11.642048", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:29:11.642048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:29:11.642048.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T22_29_11.642048", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:29:11.642048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:29:11.642048.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T22_29_11.642048", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:29:11.642048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:29:11.642048.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T22_29_11.642048", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:29:11.642048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:29:11.642048.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T22_29_11.642048", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:29:11.642048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:29:11.642048.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T22_29_11.642048", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:29:11.642048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:29:11.642048.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T22_29_11.642048", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:29:11.642048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:29:11.642048.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T22_29_11.642048", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:29:11.642048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:29:11.642048.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T22_29_11.642048", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:29:11.642048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:29:11.642048.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T22_29_11.642048", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T22:29:11.642048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T22:29:11.642048.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T22_29_11.642048", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:29:11.642048.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:29:11.642048.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T22_29_11.642048", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T22:29:11.642048.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T22:29:11.642048.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_17T08_07_45.972235", "path": ["**/details_harness|winogrande|5_2023-09-17T08-07-45.972235.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-17T08-07-45.972235.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T22_29_11.642048", "path": ["results_2023-07-19T22:29:11.642048.parquet"]}, {"split": "2023_09_17T08_07_45.972235", "path": ["results_2023-09-17T08-07-45.972235.parquet"]}, {"split": "latest", "path": ["results_2023-09-17T08-07-45.972235.parquet"]}]}]}
|
2023-09-17T07:07:57+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of MetaIX/GPT4-X-Alpasta-30b
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model MetaIX/GPT4-X-Alpasta-30b on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
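The code block that normally follows this sentence was stripped from this copy of the card. A minimal sketch is given below; it assumes the details repository follows the leaderboard's usual `open-llm-leaderboard/details_<org>__<model>` naming convention (the actual URL was stripped from this copy) and uses the `harness_winogrande_5` configuration listed in this card's metadata:

```python
from datasets import load_dataset

# Repo name inferred from the leaderboard's naming convention (an assumption,
# since the repository URL was stripped from this copy of the card).
data = load_dataset("open-llm-leaderboard/details_MetaIX__GPT4-X-Alpasta-30b",
	"harness_winogrande_5",
	split="train")
```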
## Latest results
These are the latest results from run 2023-09-17T08:07:45.972235 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
1bc1a643e57ba5b96fc8cce4e9ade85e04f82d5e
|
# Dataset Card for Evaluation run of tiiuae/falcon-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/tiiuae/falcon-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [tiiuae/falcon-7b](https://huggingface.co/tiiuae/falcon-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 122 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_tiiuae__falcon-7b",
"harness_gsm8k_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-03T17:58:16.188347](https://huggingface.co/datasets/open-llm-leaderboard/details_tiiuae__falcon-7b/blob/main/results_2023-12-03T17-58-16.188347.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.04624715693707354,
"acc_stderr": 0.005784991662691836
},
"harness|gsm8k|5": {
"acc": 0.04624715693707354,
"acc_stderr": 0.005784991662691836
}
}
```
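As a complementary sketch, the aggregated metrics shown above can also be loaded directly from the "results" configuration described in the summary. This assumes the "results" config and its "latest" split exist as laid out in this repo's metadata:

```python
from datasets import load_dataset

# "latest" points to the most recent run; earlier runs are available as
# timestamped splits (an assumption based on the split layout described above).
results = load_dataset("open-llm-leaderboard/details_tiiuae__falcon-7b",
	"results",
	split="latest")
```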
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_tiiuae__falcon-7b
|
[
"region:us"
] |
2023-08-17T23:12:34+00:00
|
{"pretty_name": "Evaluation run of tiiuae/falcon-7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [tiiuae/falcon-7b](https://huggingface.co/tiiuae/falcon-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 122 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_tiiuae__falcon-7b\",\n\t\"harness_gsm8k_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-03T17:58:16.188347](https://huggingface.co/datasets/open-llm-leaderboard/details_tiiuae__falcon-7b/blob/main/results_2023-12-03T17-58-16.188347.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.04624715693707354,\n \"acc_stderr\": 0.005784991662691836\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.04624715693707354,\n \"acc_stderr\": 0.005784991662691836\n }\n}\n```", "repo_url": "https://huggingface.co/tiiuae/falcon-7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T10_51_47.706539", "path": ["**/details_harness|arc:challenge|25_2023-07-19T10:51:47.706539.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T10:51:47.706539.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_17T10_06_45.584443", "path": ["**/details_harness|drop|3_2023-09-17T10-06-45.584443.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-17T10-06-45.584443.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_17T10_06_45.584443", "path": ["**/details_harness|gsm8k|5_2023-09-17T10-06-45.584443.parquet"]}, {"split": "2023_12_03T17_58_16.188347", "path": ["**/details_harness|gsm8k|5_2023-12-03T17-58-16.188347.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-03T17-58-16.188347.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T10_51_47.706539", "path": ["**/details_harness|hellaswag|10_2023-07-19T10:51:47.706539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T10:51:47.706539.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T10_51_47.706539", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T10:51:47.706539.parquet", 
"**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T10:51:47.706539.parquet", 
"**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T10:51:47.706539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T10:51:47.706539.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T10:51:47.706539.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T10:51:47.706539.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_19T10_51_47.706539", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T10:51:47.706539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T10:51:47.706539.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T10_51_47.706539", "path": 
["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T10:51:47.706539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T10:51:47.706539.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T10_51_47.706539", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T10:51:47.706539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T10:51:47.706539.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T10_51_47.706539", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T10:51:47.706539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T10:51:47.706539.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T10_51_47.706539", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T10:51:47.706539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T10:51:47.706539.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T10_51_47.706539", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T10:51:47.706539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T10:51:47.706539.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T10_51_47.706539", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T10:51:47.706539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T10:51:47.706539.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T10_51_47.706539", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T10:51:47.706539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T10:51:47.706539.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T10_51_47.706539", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T10:51:47.706539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T10:51:47.706539.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T10_51_47.706539", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T10:51:47.706539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T10:51:47.706539.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T10_51_47.706539", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T10:51:47.706539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T10:51:47.706539.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T10_51_47.706539", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T10:51:47.706539.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T10:51:47.706539.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T10_51_47.706539", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T10:51:47.706539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T10:51:47.706539.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T10_51_47.706539", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T10:51:47.706539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T10:51:47.706539.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T10_51_47.706539", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T10:51:47.706539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T10:51:47.706539.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T10_51_47.706539", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T10:51:47.706539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T10:51:47.706539.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T10_51_47.706539", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T10:51:47.706539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T10:51:47.706539.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T10_51_47.706539", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T10:51:47.706539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T10:51:47.706539.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T10_51_47.706539", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T10:51:47.706539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T10:51:47.706539.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T10_51_47.706539", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T10:51:47.706539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T10:51:47.706539.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T10_51_47.706539", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T10:51:47.706539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T10:51:47.706539.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T10_51_47.706539", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T10:51:47.706539.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T10:51:47.706539.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T10_51_47.706539", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T10:51:47.706539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T10:51:47.706539.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T10_51_47.706539", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T10:51:47.706539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T10:51:47.706539.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T10_51_47.706539", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T10:51:47.706539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T10:51:47.706539.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T10_51_47.706539", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T10:51:47.706539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T10:51:47.706539.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T10_51_47.706539", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T10:51:47.706539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T10:51:47.706539.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T10_51_47.706539", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T10:51:47.706539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T10:51:47.706539.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T10_51_47.706539", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T10:51:47.706539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T10:51:47.706539.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T10_51_47.706539", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T10:51:47.706539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T10:51:47.706539.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T10_51_47.706539", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T10:51:47.706539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T10:51:47.706539.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T10_51_47.706539", "path": 
["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T10:51:47.706539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T10:51:47.706539.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T10_51_47.706539", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T10:51:47.706539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T10:51:47.706539.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T10_51_47.706539", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T10:51:47.706539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T10:51:47.706539.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T10_51_47.706539", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T10:51:47.706539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T10:51:47.706539.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T10_51_47.706539", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T10:51:47.706539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T10:51:47.706539.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T10_51_47.706539", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T10:51:47.706539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T10:51:47.706539.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T10_51_47.706539", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T10:51:47.706539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T10:51:47.706539.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T10_51_47.706539", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T10:51:47.706539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T10:51:47.706539.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T10_51_47.706539", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T10:51:47.706539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T10:51:47.706539.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T10_51_47.706539", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T10:51:47.706539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T10:51:47.706539.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T10_51_47.706539", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T10:51:47.706539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T10:51:47.706539.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T10_51_47.706539", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T10:51:47.706539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T10:51:47.706539.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T10_51_47.706539", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T10:51:47.706539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T10:51:47.706539.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T10_51_47.706539", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T10:51:47.706539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T10:51:47.706539.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T10_51_47.706539", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T10:51:47.706539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T10:51:47.706539.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T10_51_47.706539", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T10:51:47.706539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T10:51:47.706539.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T10_51_47.706539", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T10:51:47.706539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T10:51:47.706539.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T10_51_47.706539", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T10:51:47.706539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T10:51:47.706539.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T10_51_47.706539", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T10:51:47.706539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T10:51:47.706539.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T10_51_47.706539", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T10:51:47.706539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T10:51:47.706539.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_19T10_51_47.706539", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T10:51:47.706539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T10:51:47.706539.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T10_51_47.706539", "path": 
["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T10:51:47.706539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T10:51:47.706539.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T10_51_47.706539", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T10:51:47.706539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T10:51:47.706539.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T10_51_47.706539", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T10:51:47.706539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T10:51:47.706539.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T10_51_47.706539", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T10:51:47.706539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T10:51:47.706539.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T10_51_47.706539", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T10:51:47.706539.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T10:51:47.706539.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T10_51_47.706539", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T10:51:47.706539.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T10:51:47.706539.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_17T10_06_45.584443", "path": ["**/details_harness|winogrande|5_2023-09-17T10-06-45.584443.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-17T10-06-45.584443.parquet"]}]}, {"config_name": "original_mmlu_5", "data_files": [{"split": "2023_08_28T20_05_31.227903", "path": ["**/details_original|mmlu:abstract_algebra|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:anatomy|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:astronomy|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:business_ethics|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:clinical_knowledge|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:college_biology|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:college_chemistry|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:college_computer_science|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:college_mathematics|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:college_medicine|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:college_physics|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:computer_security|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:conceptual_physics|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:econometrics|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:electrical_engineering|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:elementary_mathematics|5_2023-08-28T20:05:31.227903.parquet", 
"**/details_original|mmlu:formal_logic|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:global_facts|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:high_school_biology|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:high_school_chemistry|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:high_school_computer_science|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:high_school_european_history|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:high_school_geography|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:high_school_mathematics|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:high_school_physics|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:high_school_psychology|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:high_school_statistics|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:high_school_us_history|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:high_school_world_history|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:human_aging|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:human_sexuality|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:international_law|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:jurisprudence|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:logical_fallacies|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:machine_learning|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:management|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:marketing|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:medical_genetics|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:miscellaneous|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:moral_disputes|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:moral_scenarios|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:nutrition|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:philosophy|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:prehistory|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:professional_accounting|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:professional_law|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:professional_medicine|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:professional_psychology|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:public_relations|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:security_studies|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:sociology|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:us_foreign_policy|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:virology|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:world_religions|5_2023-08-28T20:05:31.227903.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:abstract_algebra|5_2023-08-28T20:05:31.227903.parquet", 
"**/details_original|mmlu:anatomy|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:astronomy|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:business_ethics|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:clinical_knowledge|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:college_biology|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:college_chemistry|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:college_computer_science|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:college_mathematics|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:college_medicine|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:college_physics|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:computer_security|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:conceptual_physics|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:econometrics|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:electrical_engineering|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:elementary_mathematics|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:formal_logic|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:global_facts|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:high_school_biology|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:high_school_chemistry|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:high_school_computer_science|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:high_school_european_history|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:high_school_geography|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:high_school_mathematics|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:high_school_physics|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:high_school_psychology|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:high_school_statistics|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:high_school_us_history|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:high_school_world_history|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:human_aging|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:human_sexuality|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:international_law|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:jurisprudence|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:logical_fallacies|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:machine_learning|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:management|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:marketing|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:medical_genetics|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:miscellaneous|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:moral_disputes|5_2023-08-28T20:05:31.227903.parquet", 
"**/details_original|mmlu:moral_scenarios|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:nutrition|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:philosophy|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:prehistory|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:professional_accounting|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:professional_law|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:professional_medicine|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:professional_psychology|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:public_relations|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:security_studies|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:sociology|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:us_foreign_policy|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:virology|5_2023-08-28T20:05:31.227903.parquet", "**/details_original|mmlu:world_religions|5_2023-08-28T20:05:31.227903.parquet"]}]}, {"config_name": "original_mmlu_abstract_algebra_5", "data_files": [{"split": "2023_08_28T20_05_31.227903", "path": ["**/details_original|mmlu:abstract_algebra|5_2023-08-28T20:05:31.227903.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:abstract_algebra|5_2023-08-28T20:05:31.227903.parquet"]}]}, {"config_name": "original_mmlu_anatomy_5", "data_files": [{"split": "2023_08_28T20_05_31.227903", "path": ["**/details_original|mmlu:anatomy|5_2023-08-28T20:05:31.227903.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:anatomy|5_2023-08-28T20:05:31.227903.parquet"]}]}, {"config_name": "original_mmlu_astronomy_5", "data_files": [{"split": "2023_08_28T20_05_31.227903", "path": ["**/details_original|mmlu:astronomy|5_2023-08-28T20:05:31.227903.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:astronomy|5_2023-08-28T20:05:31.227903.parquet"]}]}, {"config_name": "original_mmlu_business_ethics_5", "data_files": [{"split": "2023_08_28T20_05_31.227903", "path": ["**/details_original|mmlu:business_ethics|5_2023-08-28T20:05:31.227903.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:business_ethics|5_2023-08-28T20:05:31.227903.parquet"]}]}, {"config_name": "original_mmlu_clinical_knowledge_5", "data_files": [{"split": "2023_08_28T20_05_31.227903", "path": ["**/details_original|mmlu:clinical_knowledge|5_2023-08-28T20:05:31.227903.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:clinical_knowledge|5_2023-08-28T20:05:31.227903.parquet"]}]}, {"config_name": "original_mmlu_college_biology_5", "data_files": [{"split": "2023_08_28T20_05_31.227903", "path": ["**/details_original|mmlu:college_biology|5_2023-08-28T20:05:31.227903.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:college_biology|5_2023-08-28T20:05:31.227903.parquet"]}]}, {"config_name": "original_mmlu_college_chemistry_5", "data_files": [{"split": "2023_08_28T20_05_31.227903", "path": ["**/details_original|mmlu:college_chemistry|5_2023-08-28T20:05:31.227903.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:college_chemistry|5_2023-08-28T20:05:31.227903.parquet"]}]}, {"config_name": "original_mmlu_college_computer_science_5", "data_files": [{"split": "2023_08_28T20_05_31.227903", "path": ["**/details_original|mmlu:college_computer_science|5_2023-08-28T20:05:31.227903.parquet"]}, {"split": "latest", "path": 
["**/details_original|mmlu:college_computer_science|5_2023-08-28T20:05:31.227903.parquet"]}]}, {"config_name": "original_mmlu_college_mathematics_5", "data_files": [{"split": "2023_08_28T20_05_31.227903", "path": ["**/details_original|mmlu:college_mathematics|5_2023-08-28T20:05:31.227903.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:college_mathematics|5_2023-08-28T20:05:31.227903.parquet"]}]}, {"config_name": "original_mmlu_college_medicine_5", "data_files": [{"split": "2023_08_28T20_05_31.227903", "path": ["**/details_original|mmlu:college_medicine|5_2023-08-28T20:05:31.227903.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:college_medicine|5_2023-08-28T20:05:31.227903.parquet"]}]}, {"config_name": "original_mmlu_college_physics_5", "data_files": [{"split": "2023_08_28T20_05_31.227903", "path": ["**/details_original|mmlu:college_physics|5_2023-08-28T20:05:31.227903.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:college_physics|5_2023-08-28T20:05:31.227903.parquet"]}]}, {"config_name": "original_mmlu_computer_security_5", "data_files": [{"split": "2023_08_28T20_05_31.227903", "path": ["**/details_original|mmlu:computer_security|5_2023-08-28T20:05:31.227903.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:computer_security|5_2023-08-28T20:05:31.227903.parquet"]}]}, {"config_name": "original_mmlu_conceptual_physics_5", "data_files": [{"split": "2023_08_28T20_05_31.227903", "path": ["**/details_original|mmlu:conceptual_physics|5_2023-08-28T20:05:31.227903.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:conceptual_physics|5_2023-08-28T20:05:31.227903.parquet"]}]}, {"config_name": "original_mmlu_econometrics_5", "data_files": [{"split": "2023_08_28T20_05_31.227903", "path": ["**/details_original|mmlu:econometrics|5_2023-08-28T20:05:31.227903.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:econometrics|5_2023-08-28T20:05:31.227903.parquet"]}]}, {"config_name": "original_mmlu_electrical_engineering_5", "data_files": [{"split": "2023_08_28T20_05_31.227903", "path": ["**/details_original|mmlu:electrical_engineering|5_2023-08-28T20:05:31.227903.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:electrical_engineering|5_2023-08-28T20:05:31.227903.parquet"]}]}, {"config_name": "original_mmlu_elementary_mathematics_5", "data_files": [{"split": "2023_08_28T20_05_31.227903", "path": ["**/details_original|mmlu:elementary_mathematics|5_2023-08-28T20:05:31.227903.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:elementary_mathematics|5_2023-08-28T20:05:31.227903.parquet"]}]}, {"config_name": "original_mmlu_formal_logic_5", "data_files": [{"split": "2023_08_28T20_05_31.227903", "path": ["**/details_original|mmlu:formal_logic|5_2023-08-28T20:05:31.227903.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:formal_logic|5_2023-08-28T20:05:31.227903.parquet"]}]}, {"config_name": "original_mmlu_global_facts_5", "data_files": [{"split": "2023_08_28T20_05_31.227903", "path": ["**/details_original|mmlu:global_facts|5_2023-08-28T20:05:31.227903.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:global_facts|5_2023-08-28T20:05:31.227903.parquet"]}]}, {"config_name": "original_mmlu_high_school_biology_5", "data_files": [{"split": "2023_08_28T20_05_31.227903", "path": ["**/details_original|mmlu:high_school_biology|5_2023-08-28T20:05:31.227903.parquet"]}, {"split": "latest", "path": 
["**/details_original|mmlu:high_school_biology|5_2023-08-28T20:05:31.227903.parquet"]}]}, {"config_name": "original_mmlu_high_school_chemistry_5", "data_files": [{"split": "2023_08_28T20_05_31.227903", "path": ["**/details_original|mmlu:high_school_chemistry|5_2023-08-28T20:05:31.227903.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_chemistry|5_2023-08-28T20:05:31.227903.parquet"]}]}, {"config_name": "original_mmlu_high_school_computer_science_5", "data_files": [{"split": "2023_08_28T20_05_31.227903", "path": ["**/details_original|mmlu:high_school_computer_science|5_2023-08-28T20:05:31.227903.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_computer_science|5_2023-08-28T20:05:31.227903.parquet"]}]}, {"config_name": "original_mmlu_high_school_european_history_5", "data_files": [{"split": "2023_08_28T20_05_31.227903", "path": ["**/details_original|mmlu:high_school_european_history|5_2023-08-28T20:05:31.227903.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_european_history|5_2023-08-28T20:05:31.227903.parquet"]}]}, {"config_name": "original_mmlu_high_school_geography_5", "data_files": [{"split": "2023_08_28T20_05_31.227903", "path": ["**/details_original|mmlu:high_school_geography|5_2023-08-28T20:05:31.227903.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_geography|5_2023-08-28T20:05:31.227903.parquet"]}]}, {"config_name": "original_mmlu_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_28T20_05_31.227903", "path": ["**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T20:05:31.227903.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T20:05:31.227903.parquet"]}]}, {"config_name": "original_mmlu_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_28T20_05_31.227903", "path": ["**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T20:05:31.227903.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T20:05:31.227903.parquet"]}]}, {"config_name": "original_mmlu_high_school_mathematics_5", "data_files": [{"split": "2023_08_28T20_05_31.227903", "path": ["**/details_original|mmlu:high_school_mathematics|5_2023-08-28T20:05:31.227903.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_mathematics|5_2023-08-28T20:05:31.227903.parquet"]}]}, {"config_name": "original_mmlu_high_school_microeconomics_5", "data_files": [{"split": "2023_08_28T20_05_31.227903", "path": ["**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T20:05:31.227903.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T20:05:31.227903.parquet"]}]}, {"config_name": "original_mmlu_high_school_physics_5", "data_files": [{"split": "2023_08_28T20_05_31.227903", "path": ["**/details_original|mmlu:high_school_physics|5_2023-08-28T20:05:31.227903.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_physics|5_2023-08-28T20:05:31.227903.parquet"]}]}, {"config_name": "original_mmlu_high_school_psychology_5", "data_files": [{"split": "2023_08_28T20_05_31.227903", "path": ["**/details_original|mmlu:high_school_psychology|5_2023-08-28T20:05:31.227903.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_psychology|5_2023-08-28T20:05:31.227903.parquet"]}]}, {"config_name": 
"original_mmlu_high_school_statistics_5", "data_files": [{"split": "2023_08_28T20_05_31.227903", "path": ["**/details_original|mmlu:high_school_statistics|5_2023-08-28T20:05:31.227903.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_statistics|5_2023-08-28T20:05:31.227903.parquet"]}]}, {"config_name": "original_mmlu_high_school_us_history_5", "data_files": [{"split": "2023_08_28T20_05_31.227903", "path": ["**/details_original|mmlu:high_school_us_history|5_2023-08-28T20:05:31.227903.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_us_history|5_2023-08-28T20:05:31.227903.parquet"]}]}, {"config_name": "original_mmlu_high_school_world_history_5", "data_files": [{"split": "2023_08_28T20_05_31.227903", "path": ["**/details_original|mmlu:high_school_world_history|5_2023-08-28T20:05:31.227903.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_world_history|5_2023-08-28T20:05:31.227903.parquet"]}]}, {"config_name": "original_mmlu_human_aging_5", "data_files": [{"split": "2023_08_28T20_05_31.227903", "path": ["**/details_original|mmlu:human_aging|5_2023-08-28T20:05:31.227903.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:human_aging|5_2023-08-28T20:05:31.227903.parquet"]}]}, {"config_name": "original_mmlu_human_sexuality_5", "data_files": [{"split": "2023_08_28T20_05_31.227903", "path": ["**/details_original|mmlu:human_sexuality|5_2023-08-28T20:05:31.227903.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:human_sexuality|5_2023-08-28T20:05:31.227903.parquet"]}]}, {"config_name": "original_mmlu_international_law_5", "data_files": [{"split": "2023_08_28T20_05_31.227903", "path": ["**/details_original|mmlu:international_law|5_2023-08-28T20:05:31.227903.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:international_law|5_2023-08-28T20:05:31.227903.parquet"]}]}, {"config_name": "original_mmlu_jurisprudence_5", "data_files": [{"split": "2023_08_28T20_05_31.227903", "path": ["**/details_original|mmlu:jurisprudence|5_2023-08-28T20:05:31.227903.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:jurisprudence|5_2023-08-28T20:05:31.227903.parquet"]}]}, {"config_name": "original_mmlu_logical_fallacies_5", "data_files": [{"split": "2023_08_28T20_05_31.227903", "path": ["**/details_original|mmlu:logical_fallacies|5_2023-08-28T20:05:31.227903.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:logical_fallacies|5_2023-08-28T20:05:31.227903.parquet"]}]}, {"config_name": "original_mmlu_machine_learning_5", "data_files": [{"split": "2023_08_28T20_05_31.227903", "path": ["**/details_original|mmlu:machine_learning|5_2023-08-28T20:05:31.227903.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:machine_learning|5_2023-08-28T20:05:31.227903.parquet"]}]}, {"config_name": "original_mmlu_management_5", "data_files": [{"split": "2023_08_28T20_05_31.227903", "path": ["**/details_original|mmlu:management|5_2023-08-28T20:05:31.227903.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:management|5_2023-08-28T20:05:31.227903.parquet"]}]}, {"config_name": "original_mmlu_marketing_5", "data_files": [{"split": "2023_08_28T20_05_31.227903", "path": ["**/details_original|mmlu:marketing|5_2023-08-28T20:05:31.227903.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:marketing|5_2023-08-28T20:05:31.227903.parquet"]}]}, {"config_name": "original_mmlu_medical_genetics_5", "data_files": [{"split": 
"2023_08_28T20_05_31.227903", "path": ["**/details_original|mmlu:medical_genetics|5_2023-08-28T20:05:31.227903.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:medical_genetics|5_2023-08-28T20:05:31.227903.parquet"]}]}, {"config_name": "original_mmlu_miscellaneous_5", "data_files": [{"split": "2023_08_28T20_05_31.227903", "path": ["**/details_original|mmlu:miscellaneous|5_2023-08-28T20:05:31.227903.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:miscellaneous|5_2023-08-28T20:05:31.227903.parquet"]}]}, {"config_name": "original_mmlu_moral_disputes_5", "data_files": [{"split": "2023_08_28T20_05_31.227903", "path": ["**/details_original|mmlu:moral_disputes|5_2023-08-28T20:05:31.227903.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:moral_disputes|5_2023-08-28T20:05:31.227903.parquet"]}]}, {"config_name": "original_mmlu_moral_scenarios_5", "data_files": [{"split": "2023_08_28T20_05_31.227903", "path": ["**/details_original|mmlu:moral_scenarios|5_2023-08-28T20:05:31.227903.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:moral_scenarios|5_2023-08-28T20:05:31.227903.parquet"]}]}, {"config_name": "original_mmlu_nutrition_5", "data_files": [{"split": "2023_08_28T20_05_31.227903", "path": ["**/details_original|mmlu:nutrition|5_2023-08-28T20:05:31.227903.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:nutrition|5_2023-08-28T20:05:31.227903.parquet"]}]}, {"config_name": "original_mmlu_philosophy_5", "data_files": [{"split": "2023_08_28T20_05_31.227903", "path": ["**/details_original|mmlu:philosophy|5_2023-08-28T20:05:31.227903.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:philosophy|5_2023-08-28T20:05:31.227903.parquet"]}]}, {"config_name": "original_mmlu_prehistory_5", "data_files": [{"split": "2023_08_28T20_05_31.227903", "path": ["**/details_original|mmlu:prehistory|5_2023-08-28T20:05:31.227903.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:prehistory|5_2023-08-28T20:05:31.227903.parquet"]}]}, {"config_name": "original_mmlu_professional_accounting_5", "data_files": [{"split": "2023_08_28T20_05_31.227903", "path": ["**/details_original|mmlu:professional_accounting|5_2023-08-28T20:05:31.227903.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:professional_accounting|5_2023-08-28T20:05:31.227903.parquet"]}]}, {"config_name": "original_mmlu_professional_law_5", "data_files": [{"split": "2023_08_28T20_05_31.227903", "path": ["**/details_original|mmlu:professional_law|5_2023-08-28T20:05:31.227903.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:professional_law|5_2023-08-28T20:05:31.227903.parquet"]}]}, {"config_name": "original_mmlu_professional_medicine_5", "data_files": [{"split": "2023_08_28T20_05_31.227903", "path": ["**/details_original|mmlu:professional_medicine|5_2023-08-28T20:05:31.227903.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:professional_medicine|5_2023-08-28T20:05:31.227903.parquet"]}]}, {"config_name": "original_mmlu_professional_psychology_5", "data_files": [{"split": "2023_08_28T20_05_31.227903", "path": ["**/details_original|mmlu:professional_psychology|5_2023-08-28T20:05:31.227903.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:professional_psychology|5_2023-08-28T20:05:31.227903.parquet"]}]}, {"config_name": "original_mmlu_public_relations_5", "data_files": [{"split": "2023_08_28T20_05_31.227903", "path": 
["**/details_original|mmlu:public_relations|5_2023-08-28T20:05:31.227903.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:public_relations|5_2023-08-28T20:05:31.227903.parquet"]}]}, {"config_name": "original_mmlu_security_studies_5", "data_files": [{"split": "2023_08_28T20_05_31.227903", "path": ["**/details_original|mmlu:security_studies|5_2023-08-28T20:05:31.227903.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:security_studies|5_2023-08-28T20:05:31.227903.parquet"]}]}, {"config_name": "original_mmlu_sociology_5", "data_files": [{"split": "2023_08_28T20_05_31.227903", "path": ["**/details_original|mmlu:sociology|5_2023-08-28T20:05:31.227903.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:sociology|5_2023-08-28T20:05:31.227903.parquet"]}]}, {"config_name": "original_mmlu_us_foreign_policy_5", "data_files": [{"split": "2023_08_28T20_05_31.227903", "path": ["**/details_original|mmlu:us_foreign_policy|5_2023-08-28T20:05:31.227903.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:us_foreign_policy|5_2023-08-28T20:05:31.227903.parquet"]}]}, {"config_name": "original_mmlu_virology_5", "data_files": [{"split": "2023_08_28T20_05_31.227903", "path": ["**/details_original|mmlu:virology|5_2023-08-28T20:05:31.227903.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:virology|5_2023-08-28T20:05:31.227903.parquet"]}]}, {"config_name": "original_mmlu_world_religions_5", "data_files": [{"split": "2023_08_28T20_05_31.227903", "path": ["**/details_original|mmlu:world_religions|5_2023-08-28T20:05:31.227903.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:world_religions|5_2023-08-28T20:05:31.227903.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T10_51_47.706539", "path": ["results_2023-07-19T10:51:47.706539.parquet"]}, {"split": "2023_08_28T20_05_31.227903", "path": ["results_2023-08-28T20:05:31.227903.parquet"]}, {"split": "2023_09_17T10_06_45.584443", "path": ["results_2023-09-17T10-06-45.584443.parquet"]}, {"split": "2023_12_03T17_58_16.188347", "path": ["results_2023-12-03T17-58-16.188347.parquet"]}, {"split": "latest", "path": ["results_2023-12-03T17-58-16.188347.parquet"]}]}]}
|
2023-12-03T17:58:23+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of tiiuae/falcon-7b
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model tiiuae/falcon-7b on the Open LLM Leaderboard.
The dataset is composed of 122 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
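A minimal loading sketch follows; the `harness_winogrande_5` config appears in this repository's config list, while the repository id `open-llm-leaderboard/details_tiiuae__falcon-7b` is an assumption based on the naming pattern the other cards in this collection use:
```python
from datasets import load_dataset

# Repository id assumed from the open-llm-leaderboard naming convention;
# "harness_winogrande_5" is one of the task configs listed for this model.
data = load_dataset("open-llm-leaderboard/details_tiiuae__falcon-7b",
	"harness_winogrande_5",
	split="train")
```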
## Latest results
These are the latest results from run 2023-12-03T17:58:16.188347 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of tiiuae/falcon-7b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model tiiuae/falcon-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 122 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-03T17:58:16.188347(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of tiiuae/falcon-7b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model tiiuae/falcon-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 122 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-03T17:58:16.188347(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
18,
31,
167,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of tiiuae/falcon-7b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model tiiuae/falcon-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 122 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-03T17:58:16.188347(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
a4d9c95df2e36a244a380d34f3fb88fcc38274c2
|
# Dataset Card for Evaluation run of l3utterfly/llama2-7b-layla
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/l3utterfly/llama2-7b-layla
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [l3utterfly/llama2-7b-layla](https://huggingface.co/l3utterfly/llama2-7b-layla) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)); a short sketch for loading it follows the example below.
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_l3utterfly__llama2-7b-layla",
"harness_winogrande_5",
split="train")
```
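The aggregated numbers referenced above live in the "results" configuration; a minimal sketch for loading them, assuming the "latest" split that these evaluation repositories expose:
```python
from datasets import load_dataset

# "results" stores the aggregated metrics; the "latest" split is assumed to
# point at the most recent run, as in the other repositories in this collection.
results = load_dataset("open-llm-leaderboard/details_l3utterfly__llama2-7b-layla",
	"results",
	split="latest")
```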
## Latest results
These are the [latest results from run 2023-09-17T04:32:53.780547](https://huggingface.co/datasets/open-llm-leaderboard/details_l3utterfly__llama2-7b-layla/blob/main/results_2023-09-17T04-32-53.780547.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0025167785234899327,
"em_stderr": 0.0005131152834514622,
"f1": 0.06570889261744958,
"f1_stderr": 0.0014756748283544432,
"acc": 0.4130167852161326,
"acc_stderr": 0.009994364317722083
},
"harness|drop|3": {
"em": 0.0025167785234899327,
"em_stderr": 0.0005131152834514622,
"f1": 0.06570889261744958,
"f1_stderr": 0.0014756748283544432
},
"harness|gsm8k|5": {
"acc": 0.08491281273692192,
"acc_stderr": 0.007678212824450795
},
"harness|winogrande|5": {
"acc": 0.7411207576953434,
"acc_stderr": 0.012310515810993372
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_l3utterfly__llama2-7b-layla
|
[
"region:us"
] |
2023-08-17T23:12:44+00:00
|
{"pretty_name": "Evaluation run of l3utterfly/llama2-7b-layla", "dataset_summary": "Dataset automatically created during the evaluation run of model [l3utterfly/llama2-7b-layla](https://huggingface.co/l3utterfly/llama2-7b-layla) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_l3utterfly__llama2-7b-layla\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-17T04:32:53.780547](https://huggingface.co/datasets/open-llm-leaderboard/details_l3utterfly__llama2-7b-layla/blob/main/results_2023-09-17T04-32-53.780547.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0025167785234899327,\n \"em_stderr\": 0.0005131152834514622,\n \"f1\": 0.06570889261744958,\n \"f1_stderr\": 0.0014756748283544432,\n \"acc\": 0.4130167852161326,\n \"acc_stderr\": 0.009994364317722083\n },\n \"harness|drop|3\": {\n \"em\": 0.0025167785234899327,\n \"em_stderr\": 0.0005131152834514622,\n \"f1\": 0.06570889261744958,\n \"f1_stderr\": 0.0014756748283544432\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.08491281273692192,\n \"acc_stderr\": 0.007678212824450795\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7411207576953434,\n \"acc_stderr\": 0.012310515810993372\n }\n}\n```", "repo_url": "https://huggingface.co/l3utterfly/llama2-7b-layla", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_09T20_58_39.874596", "path": ["**/details_harness|arc:challenge|25_2023-08-09T20:58:39.874596.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-09T20:58:39.874596.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_17T04_32_53.780547", "path": ["**/details_harness|drop|3_2023-09-17T04-32-53.780547.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-17T04-32-53.780547.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_17T04_32_53.780547", "path": ["**/details_harness|gsm8k|5_2023-09-17T04-32-53.780547.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-17T04-32-53.780547.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_09T20_58_39.874596", "path": ["**/details_harness|hellaswag|10_2023-08-09T20:58:39.874596.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-09T20:58:39.874596.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", 
"data_files": [{"split": "2023_08_09T20_58_39.874596", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T20:58:39.874596.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T20:58:39.874596.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T20:58:39.874596.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T20:58:39.874596.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T20:58:39.874596.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_08_09T20_58_39.874596", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T20:58:39.874596.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T20:58:39.874596.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_09T20_58_39.874596", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T20:58:39.874596.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T20:58:39.874596.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_09T20_58_39.874596", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T20:58:39.874596.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T20:58:39.874596.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_09T20_58_39.874596", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T20:58:39.874596.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T20:58:39.874596.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_09T20_58_39.874596", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T20:58:39.874596.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T20:58:39.874596.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_09T20_58_39.874596", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T20:58:39.874596.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T20:58:39.874596.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_09T20_58_39.874596", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T20:58:39.874596.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T20:58:39.874596.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_09T20_58_39.874596", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T20:58:39.874596.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T20:58:39.874596.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_09T20_58_39.874596", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T20:58:39.874596.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T20:58:39.874596.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_09T20_58_39.874596", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T20:58:39.874596.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T20:58:39.874596.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_09T20_58_39.874596", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T20:58:39.874596.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T20:58:39.874596.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_09T20_58_39.874596", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T20:58:39.874596.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T20:58:39.874596.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_09T20_58_39.874596", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T20:58:39.874596.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T20:58:39.874596.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_09T20_58_39.874596", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T20:58:39.874596.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T20:58:39.874596.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_09T20_58_39.874596", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T20:58:39.874596.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T20:58:39.874596.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_09T20_58_39.874596", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T20:58:39.874596.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T20:58:39.874596.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_09T20_58_39.874596", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T20:58:39.874596.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T20:58:39.874596.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_09T20_58_39.874596", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T20:58:39.874596.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T20:58:39.874596.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_09T20_58_39.874596", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T20:58:39.874596.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T20:58:39.874596.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_09T20_58_39.874596", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T20:58:39.874596.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T20:58:39.874596.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_09T20_58_39.874596", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T20:58:39.874596.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T20:58:39.874596.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_09T20_58_39.874596", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T20:58:39.874596.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T20:58:39.874596.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_09T20_58_39.874596", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T20:58:39.874596.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T20:58:39.874596.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_09T20_58_39.874596", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T20:58:39.874596.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T20:58:39.874596.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_09T20_58_39.874596", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T20:58:39.874596.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T20:58:39.874596.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_09T20_58_39.874596", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T20:58:39.874596.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T20:58:39.874596.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_09T20_58_39.874596", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T20:58:39.874596.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T20:58:39.874596.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_09T20_58_39.874596", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T20:58:39.874596.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T20:58:39.874596.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_09T20_58_39.874596", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T20:58:39.874596.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T20:58:39.874596.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_09T20_58_39.874596", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T20:58:39.874596.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T20:58:39.874596.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_09T20_58_39.874596", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T20:58:39.874596.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T20:58:39.874596.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_09T20_58_39.874596", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T20:58:39.874596.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T20:58:39.874596.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_09T20_58_39.874596", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T20:58:39.874596.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T20:58:39.874596.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_09T20_58_39.874596", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T20:58:39.874596.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T20:58:39.874596.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_09T20_58_39.874596", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T20:58:39.874596.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T20:58:39.874596.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_09T20_58_39.874596", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T20:58:39.874596.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T20:58:39.874596.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_09T20_58_39.874596", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T20:58:39.874596.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T20:58:39.874596.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_09T20_58_39.874596", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T20:58:39.874596.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T20:58:39.874596.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_09T20_58_39.874596", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T20:58:39.874596.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T20:58:39.874596.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_09T20_58_39.874596", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T20:58:39.874596.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T20:58:39.874596.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_09T20_58_39.874596", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T20:58:39.874596.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T20:58:39.874596.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_09T20_58_39.874596", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T20:58:39.874596.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T20:58:39.874596.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_09T20_58_39.874596", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T20:58:39.874596.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T20:58:39.874596.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_09T20_58_39.874596", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T20:58:39.874596.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T20:58:39.874596.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_09T20_58_39.874596", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T20:58:39.874596.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T20:58:39.874596.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_09T20_58_39.874596", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T20:58:39.874596.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T20:58:39.874596.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_09T20_58_39.874596", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T20:58:39.874596.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T20:58:39.874596.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_09T20_58_39.874596", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T20:58:39.874596.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T20:58:39.874596.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_09T20_58_39.874596", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T20:58:39.874596.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T20:58:39.874596.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_09T20_58_39.874596", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T20:58:39.874596.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T20:58:39.874596.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_09T20_58_39.874596", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T20:58:39.874596.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T20:58:39.874596.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_08_09T20_58_39.874596", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T20:58:39.874596.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T20:58:39.874596.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_09T20_58_39.874596", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T20:58:39.874596.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T20:58:39.874596.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_09T20_58_39.874596", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T20:58:39.874596.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T20:58:39.874596.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_09T20_58_39.874596", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T20:58:39.874596.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T20:58:39.874596.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_09T20_58_39.874596", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T20:58:39.874596.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T20:58:39.874596.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_09T20_58_39.874596", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T20:58:39.874596.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T20:58:39.874596.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_09T20_58_39.874596", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T20:58:39.874596.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T20:58:39.874596.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_17T04_32_53.780547", "path": ["**/details_harness|winogrande|5_2023-09-17T04-32-53.780547.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-17T04-32-53.780547.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_09T20_58_39.874596", "path": ["results_2023-08-09T20:58:39.874596.parquet"]}, {"split": "2023_09_17T04_32_53.780547", "path": ["results_2023-09-17T04-32-53.780547.parquet"]}, {"split": "latest", "path": ["results_2023-09-17T04-32-53.780547.parquet"]}]}]}
|
2023-09-17T03:33:05+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of l3utterfly/llama2-7b-layla
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model l3utterfly/llama2-7b-layla on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
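The original snippet was stripped from this record during processing; a minimal sketch following the pattern used by the other cards in this dump (the repository id `open-llm-leaderboard/details_l3utterfly__llama2-7b-layla` is inferred from the leaderboard's naming convention and is an assumption) would be:
```python
from datasets import load_dataset

# Repository id inferred from the leaderboard's "details_{org}__{model}" convention.
data = load_dataset("open-llm-leaderboard/details_l3utterfly__llama2-7b-layla",
	"harness_winogrande_5",
	split="train")
```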
## Latest results
These are the latest results from run 2023-09-17T04:32:53.780547 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of l3utterfly/llama2-7b-layla",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model l3utterfly/llama2-7b-layla on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-17T04:32:53.780547(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of l3utterfly/llama2-7b-layla",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model l3utterfly/llama2-7b-layla on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-17T04:32:53.780547(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
23,
31,
171,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of l3utterfly/llama2-7b-layla## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model l3utterfly/llama2-7b-layla on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-17T04:32:53.780547(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
4878578ec5827e9b7ac1a607838997fc22f0c79f
|
# Dataset Card for Evaluation run of wenge-research/yayi-7b-llama2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/wenge-research/yayi-7b-llama2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [wenge-research/yayi-7b-llama2](https://huggingface.co/wenge-research/yayi-7b-llama2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_wenge-research__yayi-7b-llama2",
"harness_winogrande_5",
split="train")
```
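Each configuration also exposes a "latest" split pointing at the most recent run, and the "results" configuration collects the aggregated run-level metrics; a sketch using the configs declared in this card's metadata:
```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics of each run;
# its "latest" split points to the most recent one (2023-09-23T15:31:48.459973).
results = load_dataset("open-llm-leaderboard/details_wenge-research__yayi-7b-llama2",
	"results",
	split="latest")
```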
## Latest results
These are the [latest results from run 2023-09-23T15:31:48.459973](https://huggingface.co/datasets/open-llm-leaderboard/details_wenge-research__yayi-7b-llama2/blob/main/results_2023-09-23T15-31-48.459973.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0017827181208053692,
"em_stderr": 0.0004320097346038788,
"f1": 0.06044777684563769,
"f1_stderr": 0.0013803417345618757,
"acc": 0.40589214880805274,
"acc_stderr": 0.009561073756915776
},
"harness|drop|3": {
"em": 0.0017827181208053692,
"em_stderr": 0.0004320097346038788,
"f1": 0.06044777684563769,
"f1_stderr": 0.0013803417345618757
},
"harness|gsm8k|5": {
"acc": 0.0667172100075815,
"acc_stderr": 0.006873340544455132
},
"harness|winogrande|5": {
"acc": 0.745067087608524,
"acc_stderr": 0.012248806969376422
}
}
```
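Note that the "all" block appears to be the unweighted mean of the per-task metrics; a quick check, assuming that convention:
```python
# The "all" accuracy matches the unweighted mean of the per-task accuracies
# (up to floating-point rounding):
acc_all = (0.0667172100075815 + 0.745067087608524) / 2  # gsm8k, winogrande
print(acc_all)  # ~0.40589214880805274, as reported above
```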
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_wenge-research__yayi-7b-llama2
|
[
"region:us"
] |
2023-08-17T23:12:52+00:00
|
{"pretty_name": "Evaluation run of wenge-research/yayi-7b-llama2", "dataset_summary": "Dataset automatically created during the evaluation run of model [wenge-research/yayi-7b-llama2](https://huggingface.co/wenge-research/yayi-7b-llama2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_wenge-research__yayi-7b-llama2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-23T15:31:48.459973](https://huggingface.co/datasets/open-llm-leaderboard/details_wenge-research__yayi-7b-llama2/blob/main/results_2023-09-23T15-31-48.459973.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0017827181208053692,\n \"em_stderr\": 0.0004320097346038788,\n \"f1\": 0.06044777684563769,\n \"f1_stderr\": 0.0013803417345618757,\n \"acc\": 0.40589214880805274,\n \"acc_stderr\": 0.009561073756915776\n },\n \"harness|drop|3\": {\n \"em\": 0.0017827181208053692,\n \"em_stderr\": 0.0004320097346038788,\n \"f1\": 0.06044777684563769,\n \"f1_stderr\": 0.0013803417345618757\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0667172100075815,\n \"acc_stderr\": 0.006873340544455132\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.745067087608524,\n \"acc_stderr\": 0.012248806969376422\n }\n}\n```", "repo_url": "https://huggingface.co/wenge-research/yayi-7b-llama2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_24T11_30_18.986210", "path": ["**/details_harness|arc:challenge|25_2023-07-24T11:30:18.986210.parquet"]}, {"split": "2023_07_27T10_53_28.461116", "path": ["**/details_harness|arc:challenge|25_2023-07-27T10:53:28.461116.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-27T10:53:28.461116.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_23T09_26_06.826895", "path": ["**/details_harness|drop|3_2023-09-23T09-26-06.826895.parquet"]}, {"split": "2023_09_23T15_31_48.459973", "path": ["**/details_harness|drop|3_2023-09-23T15-31-48.459973.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-23T15-31-48.459973.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_23T09_26_06.826895", "path": ["**/details_harness|gsm8k|5_2023-09-23T09-26-06.826895.parquet"]}, {"split": "2023_09_23T15_31_48.459973", "path": ["**/details_harness|gsm8k|5_2023-09-23T15-31-48.459973.parquet"]}, {"split": "latest", "path": 
["**/details_harness|gsm8k|5_2023-09-23T15-31-48.459973.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_24T11_30_18.986210", "path": ["**/details_harness|hellaswag|10_2023-07-24T11:30:18.986210.parquet"]}, {"split": "2023_07_27T10_53_28.461116", "path": ["**/details_harness|hellaswag|10_2023-07-27T10:53:28.461116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-27T10:53:28.461116.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_24T11_30_18.986210", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T11:30:18.986210.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T11:30:18.986210.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T11:30:18.986210.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T11:30:18.986210.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T11:30:18.986210.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T11:30:18.986210.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T11:30:18.986210.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T11:30:18.986210.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T11:30:18.986210.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T11:30:18.986210.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T11:30:18.986210.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T11:30:18.986210.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T11:30:18.986210.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-24T11:30:18.986210.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T11:30:18.986210.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T11:30:18.986210.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T11:30:18.986210.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T11:30:18.986210.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T11:30:18.986210.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T11:30:18.986210.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T11:30:18.986210.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T11:30:18.986210.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T11:30:18.986210.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T11:30:18.986210.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T11:30:18.986210.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T11:30:18.986210.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T11:30:18.986210.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T11:30:18.986210.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T11:30:18.986210.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T11:30:18.986210.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T11:30:18.986210.parquet", 
"**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T11:30:18.986210.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-24T11:30:18.986210.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T11:30:18.986210.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-24T11:30:18.986210.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T11:30:18.986210.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T11:30:18.986210.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T11:30:18.986210.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-24T11:30:18.986210.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T11:30:18.986210.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T11:30:18.986210.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T11:30:18.986210.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T11:30:18.986210.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T11:30:18.986210.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T11:30:18.986210.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T11:30:18.986210.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T11:30:18.986210.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T11:30:18.986210.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T11:30:18.986210.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T11:30:18.986210.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T11:30:18.986210.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-24T11:30:18.986210.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T11:30:18.986210.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T11:30:18.986210.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T11:30:18.986210.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T11:30:18.986210.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-24T11:30:18.986210.parquet"]}, {"split": "2023_07_27T10_53_28.461116", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-27T10:53:28.461116.parquet", 
"**/details_harness|hendrycksTest-econometrics|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-27T10:53:28.461116.parquet", 
"**/details_harness|hendrycksTest-public_relations|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-27T10:53:28.461116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-27T10:53:28.461116.parquet", 
"**/details_harness|hendrycksTest-human_aging|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-27T10:53:28.461116.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-27T10:53:28.461116.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_24T11_30_18.986210", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T11:30:18.986210.parquet"]}, {"split": "2023_07_27T10_53_28.461116", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-27T10:53:28.461116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-27T10:53:28.461116.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_24T11_30_18.986210", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T11:30:18.986210.parquet"]}, {"split": "2023_07_27T10_53_28.461116", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-27T10:53:28.461116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-27T10:53:28.461116.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_24T11_30_18.986210", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T11:30:18.986210.parquet"]}, {"split": "2023_07_27T10_53_28.461116", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-27T10:53:28.461116.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-astronomy|5_2023-07-27T10:53:28.461116.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_24T11_30_18.986210", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T11:30:18.986210.parquet"]}, {"split": "2023_07_27T10_53_28.461116", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-27T10:53:28.461116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-27T10:53:28.461116.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_24T11_30_18.986210", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T11:30:18.986210.parquet"]}, {"split": "2023_07_27T10_53_28.461116", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-27T10:53:28.461116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-27T10:53:28.461116.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_24T11_30_18.986210", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T11:30:18.986210.parquet"]}, {"split": "2023_07_27T10_53_28.461116", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-27T10:53:28.461116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-27T10:53:28.461116.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_24T11_30_18.986210", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T11:30:18.986210.parquet"]}, {"split": "2023_07_27T10_53_28.461116", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-27T10:53:28.461116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-27T10:53:28.461116.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_24T11_30_18.986210", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T11:30:18.986210.parquet"]}, {"split": "2023_07_27T10_53_28.461116", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-27T10:53:28.461116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-27T10:53:28.461116.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_24T11_30_18.986210", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T11:30:18.986210.parquet"]}, {"split": "2023_07_27T10_53_28.461116", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-27T10:53:28.461116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-27T10:53:28.461116.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_24T11_30_18.986210", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T11:30:18.986210.parquet"]}, {"split": "2023_07_27T10_53_28.461116", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-27T10:53:28.461116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-27T10:53:28.461116.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", 
"data_files": [{"split": "2023_07_24T11_30_18.986210", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T11:30:18.986210.parquet"]}, {"split": "2023_07_27T10_53_28.461116", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-27T10:53:28.461116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-27T10:53:28.461116.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_24T11_30_18.986210", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T11:30:18.986210.parquet"]}, {"split": "2023_07_27T10_53_28.461116", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-27T10:53:28.461116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-27T10:53:28.461116.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_24T11_30_18.986210", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T11:30:18.986210.parquet"]}, {"split": "2023_07_27T10_53_28.461116", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-27T10:53:28.461116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-27T10:53:28.461116.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_24T11_30_18.986210", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T11:30:18.986210.parquet"]}, {"split": "2023_07_27T10_53_28.461116", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-27T10:53:28.461116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-27T10:53:28.461116.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_24T11_30_18.986210", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T11:30:18.986210.parquet"]}, {"split": "2023_07_27T10_53_28.461116", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-27T10:53:28.461116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-27T10:53:28.461116.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_24T11_30_18.986210", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T11:30:18.986210.parquet"]}, {"split": "2023_07_27T10_53_28.461116", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-27T10:53:28.461116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-27T10:53:28.461116.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_24T11_30_18.986210", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T11:30:18.986210.parquet"]}, {"split": "2023_07_27T10_53_28.461116", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-27T10:53:28.461116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-27T10:53:28.461116.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_24T11_30_18.986210", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T11:30:18.986210.parquet"]}, {"split": 
"2023_07_27T10_53_28.461116", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-27T10:53:28.461116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-27T10:53:28.461116.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_24T11_30_18.986210", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T11:30:18.986210.parquet"]}, {"split": "2023_07_27T10_53_28.461116", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-27T10:53:28.461116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-27T10:53:28.461116.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_24T11_30_18.986210", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T11:30:18.986210.parquet"]}, {"split": "2023_07_27T10_53_28.461116", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-27T10:53:28.461116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-27T10:53:28.461116.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_24T11_30_18.986210", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T11:30:18.986210.parquet"]}, {"split": "2023_07_27T10_53_28.461116", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-27T10:53:28.461116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-27T10:53:28.461116.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_24T11_30_18.986210", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T11:30:18.986210.parquet"]}, {"split": "2023_07_27T10_53_28.461116", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-27T10:53:28.461116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-27T10:53:28.461116.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_24T11_30_18.986210", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T11:30:18.986210.parquet"]}, {"split": "2023_07_27T10_53_28.461116", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-27T10:53:28.461116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-27T10:53:28.461116.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_24T11_30_18.986210", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T11:30:18.986210.parquet"]}, {"split": "2023_07_27T10_53_28.461116", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-27T10:53:28.461116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-27T10:53:28.461116.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_24T11_30_18.986210", "path": 
["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T11:30:18.986210.parquet"]}, {"split": "2023_07_27T10_53_28.461116", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-27T10:53:28.461116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-27T10:53:28.461116.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_24T11_30_18.986210", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T11:30:18.986210.parquet"]}, {"split": "2023_07_27T10_53_28.461116", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-27T10:53:28.461116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-27T10:53:28.461116.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_24T11_30_18.986210", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T11:30:18.986210.parquet"]}, {"split": "2023_07_27T10_53_28.461116", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-27T10:53:28.461116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-27T10:53:28.461116.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_24T11_30_18.986210", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T11:30:18.986210.parquet"]}, {"split": "2023_07_27T10_53_28.461116", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-27T10:53:28.461116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-27T10:53:28.461116.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_24T11_30_18.986210", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T11:30:18.986210.parquet"]}, {"split": "2023_07_27T10_53_28.461116", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-27T10:53:28.461116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-27T10:53:28.461116.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_24T11_30_18.986210", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T11:30:18.986210.parquet"]}, {"split": "2023_07_27T10_53_28.461116", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-27T10:53:28.461116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-27T10:53:28.461116.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_24T11_30_18.986210", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T11:30:18.986210.parquet"]}, {"split": "2023_07_27T10_53_28.461116", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-27T10:53:28.461116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-27T10:53:28.461116.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_24T11_30_18.986210", "path": 
["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T11:30:18.986210.parquet"]}, {"split": "2023_07_27T10_53_28.461116", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-27T10:53:28.461116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-27T10:53:28.461116.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_24T11_30_18.986210", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T11:30:18.986210.parquet"]}, {"split": "2023_07_27T10_53_28.461116", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-27T10:53:28.461116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-27T10:53:28.461116.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_24T11_30_18.986210", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T11:30:18.986210.parquet"]}, {"split": "2023_07_27T10_53_28.461116", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-27T10:53:28.461116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-27T10:53:28.461116.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_24T11_30_18.986210", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T11:30:18.986210.parquet"]}, {"split": "2023_07_27T10_53_28.461116", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-27T10:53:28.461116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-27T10:53:28.461116.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_24T11_30_18.986210", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T11:30:18.986210.parquet"]}, {"split": "2023_07_27T10_53_28.461116", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-27T10:53:28.461116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-27T10:53:28.461116.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_24T11_30_18.986210", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T11:30:18.986210.parquet"]}, {"split": "2023_07_27T10_53_28.461116", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-27T10:53:28.461116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-27T10:53:28.461116.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_24T11_30_18.986210", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T11:30:18.986210.parquet"]}, {"split": "2023_07_27T10_53_28.461116", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-27T10:53:28.461116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-27T10:53:28.461116.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_24T11_30_18.986210", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T11:30:18.986210.parquet"]}, {"split": "2023_07_27T10_53_28.461116", "path": 
["**/details_harness|hendrycksTest-management|5_2023-07-27T10:53:28.461116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-27T10:53:28.461116.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_24T11_30_18.986210", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T11:30:18.986210.parquet"]}, {"split": "2023_07_27T10_53_28.461116", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-27T10:53:28.461116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-27T10:53:28.461116.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_24T11_30_18.986210", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T11:30:18.986210.parquet"]}, {"split": "2023_07_27T10_53_28.461116", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-27T10:53:28.461116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-27T10:53:28.461116.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_24T11_30_18.986210", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T11:30:18.986210.parquet"]}, {"split": "2023_07_27T10_53_28.461116", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-27T10:53:28.461116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-27T10:53:28.461116.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_24T11_30_18.986210", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T11:30:18.986210.parquet"]}, {"split": "2023_07_27T10_53_28.461116", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-27T10:53:28.461116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-27T10:53:28.461116.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_24T11_30_18.986210", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T11:30:18.986210.parquet"]}, {"split": "2023_07_27T10_53_28.461116", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-27T10:53:28.461116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-27T10:53:28.461116.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_24T11_30_18.986210", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T11:30:18.986210.parquet"]}, {"split": "2023_07_27T10_53_28.461116", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-27T10:53:28.461116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-27T10:53:28.461116.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_24T11_30_18.986210", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T11:30:18.986210.parquet"]}, {"split": "2023_07_27T10_53_28.461116", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-27T10:53:28.461116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-27T10:53:28.461116.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": 
"2023_07_24T11_30_18.986210", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T11:30:18.986210.parquet"]}, {"split": "2023_07_27T10_53_28.461116", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-27T10:53:28.461116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-27T10:53:28.461116.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_24T11_30_18.986210", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T11:30:18.986210.parquet"]}, {"split": "2023_07_27T10_53_28.461116", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-27T10:53:28.461116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-27T10:53:28.461116.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_24T11_30_18.986210", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T11:30:18.986210.parquet"]}, {"split": "2023_07_27T10_53_28.461116", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-27T10:53:28.461116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-27T10:53:28.461116.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_24T11_30_18.986210", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T11:30:18.986210.parquet"]}, {"split": "2023_07_27T10_53_28.461116", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-27T10:53:28.461116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-27T10:53:28.461116.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_24T11_30_18.986210", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T11:30:18.986210.parquet"]}, {"split": "2023_07_27T10_53_28.461116", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-27T10:53:28.461116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-27T10:53:28.461116.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_24T11_30_18.986210", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T11:30:18.986210.parquet"]}, {"split": "2023_07_27T10_53_28.461116", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-27T10:53:28.461116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-27T10:53:28.461116.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_24T11_30_18.986210", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T11:30:18.986210.parquet"]}, {"split": "2023_07_27T10_53_28.461116", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-27T10:53:28.461116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-27T10:53:28.461116.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_24T11_30_18.986210", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T11:30:18.986210.parquet"]}, {"split": 
"2023_07_27T10_53_28.461116", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-27T10:53:28.461116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-27T10:53:28.461116.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_24T11_30_18.986210", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T11:30:18.986210.parquet"]}, {"split": "2023_07_27T10_53_28.461116", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-27T10:53:28.461116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-27T10:53:28.461116.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_24T11_30_18.986210", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T11:30:18.986210.parquet"]}, {"split": "2023_07_27T10_53_28.461116", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-27T10:53:28.461116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-27T10:53:28.461116.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_24T11_30_18.986210", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T11:30:18.986210.parquet"]}, {"split": "2023_07_27T10_53_28.461116", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-27T10:53:28.461116.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-27T10:53:28.461116.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_24T11_30_18.986210", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T11:30:18.986210.parquet"]}, {"split": "2023_07_27T10_53_28.461116", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-27T10:53:28.461116.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-27T10:53:28.461116.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_23T09_26_06.826895", "path": ["**/details_harness|winogrande|5_2023-09-23T09-26-06.826895.parquet"]}, {"split": "2023_09_23T15_31_48.459973", "path": ["**/details_harness|winogrande|5_2023-09-23T15-31-48.459973.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-23T15-31-48.459973.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_24T11_30_18.986210", "path": ["results_2023-07-24T11:30:18.986210.parquet"]}, {"split": "2023_07_27T10_53_28.461116", "path": ["results_2023-07-27T10:53:28.461116.parquet"]}, {"split": "2023_09_23T09_26_06.826895", "path": ["results_2023-09-23T09-26-06.826895.parquet"]}, {"split": "2023_09_23T15_31_48.459973", "path": ["results_2023-09-23T15-31-48.459973.parquet"]}, {"split": "latest", "path": ["results_2023-09-23T15-31-48.459973.parquet"]}]}]}
|
2023-09-23T14:32:00+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of wenge-research/yayi-7b-llama2
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model wenge-research/yayi-7b-llama2 on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
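A minimal sketch, assuming the leaderboard's usual repository naming convention ("open-llm-leaderboard/details_<org>__<model>"):
```python
from datasets import load_dataset

# Assumed repo id, following the leaderboard naming convention.
data = load_dataset("open-llm-leaderboard/details_wenge-research__yayi-7b-llama2",
	"harness_winogrande_5",
	split="train")
```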
## Latest results
These are the latest results from run 2023-09-23T15:31:48.459973 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
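The aggregated scores themselves live in the "results" configuration, whose "latest" split points at the newest run (see the configs metadata above); a minimal sketch, under the same repo-id assumption:
```python
from datasets import load_dataset

# "results" exposes one split per run plus a "latest" alias.
results = load_dataset("open-llm-leaderboard/details_wenge-research__yayi-7b-llama2",
	"results",
	split="latest")
```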
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of wenge-research/yayi-7b-llama2",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model wenge-research/yayi-7b-llama2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-23T15:31:48.459973(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of wenge-research/yayi-7b-llama2",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model wenge-research/yayi-7b-llama2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-23T15:31:48.459973(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
23,
31,
171,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of wenge-research/yayi-7b-llama2## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model wenge-research/yayi-7b-llama2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-23T15:31:48.459973(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
bbfe13a43feb4ee4af11a3d8c7e5b3d82e93902f
|
# Dataset Card for Evaluation run of wenge-research/yayi-13b-llama2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/wenge-research/yayi-13b-llama2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [wenge-research/yayi-13b-llama2](https://huggingface.co/wenge-research/yayi-13b-llama2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_wenge-research__yayi-13b-llama2",
"harness_winogrande_5",
split="train")
```
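Similarly, the aggregated scores described above can be loaded from the "results" configuration; a minimal sketch, assuming it follows the same split layout as the other leaderboard detail datasets:
```python
from datasets import load_dataset

# The "latest" split of the "results" configuration points at the newest run.
results = load_dataset("open-llm-leaderboard/details_wenge-research__yayi-13b-llama2",
	"results",
	split="latest")
```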
## Latest results
These are the [latest results from run 2023-10-15T08:54:37.748891](https://huggingface.co/datasets/open-llm-leaderboard/details_wenge-research__yayi-13b-llama2/blob/main/results_2023-10-15T08-54-37.748891.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0026216442953020135,
"em_stderr": 0.0005236685642966022,
"f1": 0.05916107382550354,
"f1_stderr": 0.0014083828571043837,
"acc": 0.3685519093475062,
"acc_stderr": 0.009163083599802495
},
"harness|drop|3": {
"em": 0.0026216442953020135,
"em_stderr": 0.0005236685642966022,
"f1": 0.05916107382550354,
"f1_stderr": 0.0014083828571043837
},
"harness|gsm8k|5": {
"acc": 0.0401819560272934,
"acc_stderr": 0.00540943973697052
},
"harness|winogrande|5": {
"acc": 0.696921862667719,
"acc_stderr": 0.012916727462634468
}
}
```
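The same numbers can be read straight from the per-run JSON file linked above, which sits at the root of the dataset repository; a minimal sketch using `huggingface_hub` (an assumption — any HTTP client pointed at the blob URL works as well):

```python
import json

from huggingface_hub import hf_hub_download

# Filename taken from the "Latest results" link above.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_wenge-research__yayi-13b-llama2",
    filename="results_2023-10-15T08-54-37.748891.json",
    repo_type="dataset",
)
with open(path) as f:
    latest = json.load(f)

print(latest["all"]["acc"])  # e.g. the aggregated accuracy shown above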
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_wenge-research__yayi-13b-llama2
|
[
"region:us"
] |
2023-08-17T23:13:10+00:00
|
{"pretty_name": "Evaluation run of wenge-research/yayi-13b-llama2", "dataset_summary": "Dataset automatically created during the evaluation run of model [wenge-research/yayi-13b-llama2](https://huggingface.co/wenge-research/yayi-13b-llama2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_wenge-research__yayi-13b-llama2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-15T08:54:37.748891](https://huggingface.co/datasets/open-llm-leaderboard/details_wenge-research__yayi-13b-llama2/blob/main/results_2023-10-15T08-54-37.748891.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0026216442953020135,\n \"em_stderr\": 0.0005236685642966022,\n \"f1\": 0.05916107382550354,\n \"f1_stderr\": 0.0014083828571043837,\n \"acc\": 0.3685519093475062,\n \"acc_stderr\": 0.009163083599802495\n },\n \"harness|drop|3\": {\n \"em\": 0.0026216442953020135,\n \"em_stderr\": 0.0005236685642966022,\n \"f1\": 0.05916107382550354,\n \"f1_stderr\": 0.0014083828571043837\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0401819560272934,\n \"acc_stderr\": 0.00540943973697052\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.696921862667719,\n \"acc_stderr\": 0.012916727462634468\n }\n}\n```", "repo_url": "https://huggingface.co/wenge-research/yayi-13b-llama2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_27T11_25_49.235330", "path": ["**/details_harness|arc:challenge|25_2023-07-27T11:25:49.235330.parquet"]}, {"split": "2023_10_01T13_33_23.488564", "path": ["**/details_harness|arc:challenge|25_2023-10-01T13-33-23.488564.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-10-01T13-33-23.488564.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_15T08_54_37.748891", "path": ["**/details_harness|drop|3_2023-10-15T08-54-37.748891.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-15T08-54-37.748891.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_15T08_54_37.748891", "path": ["**/details_harness|gsm8k|5_2023-10-15T08-54-37.748891.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-15T08-54-37.748891.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_27T11_25_49.235330", "path": ["**/details_harness|hellaswag|10_2023-07-27T11:25:49.235330.parquet"]}, {"split": 
"2023_10_01T13_33_23.488564", "path": ["**/details_harness|hellaswag|10_2023-10-01T13-33-23.488564.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-10-01T13-33-23.488564.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_27T11_25_49.235330", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-27T11:25:49.235330.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-27T11:25:49.235330.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-27T11:25:49.235330.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-27T11:25:49.235330.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-27T11:25:49.235330.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-27T11:25:49.235330.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-27T11:25:49.235330.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-27T11:25:49.235330.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-27T11:25:49.235330.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-27T11:25:49.235330.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-27T11:25:49.235330.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-27T11:25:49.235330.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-27T11:25:49.235330.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-27T11:25:49.235330.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-27T11:25:49.235330.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-27T11:25:49.235330.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-27T11:25:49.235330.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-27T11:25:49.235330.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-27T11:25:49.235330.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-27T11:25:49.235330.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-27T11:25:49.235330.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-27T11:25:49.235330.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-27T11:25:49.235330.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-27T11:25:49.235330.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-27T11:25:49.235330.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-27T11:25:49.235330.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-27T11:25:49.235330.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-27T11:25:49.235330.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-27T11:25:49.235330.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-27T11:25:49.235330.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-27T11:25:49.235330.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-27T11:25:49.235330.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-27T11:25:49.235330.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-27T11:25:49.235330.parquet", 
"**/details_harness|hendrycksTest-international_law|5_2023-07-27T11:25:49.235330.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-27T11:25:49.235330.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-27T11:25:49.235330.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-27T11:25:49.235330.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-27T11:25:49.235330.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-27T11:25:49.235330.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-27T11:25:49.235330.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-27T11:25:49.235330.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-27T11:25:49.235330.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-27T11:25:49.235330.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-27T11:25:49.235330.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-27T11:25:49.235330.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-27T11:25:49.235330.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-27T11:25:49.235330.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-27T11:25:49.235330.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-27T11:25:49.235330.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-27T11:25:49.235330.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-27T11:25:49.235330.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-27T11:25:49.235330.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-27T11:25:49.235330.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-27T11:25:49.235330.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-27T11:25:49.235330.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-27T11:25:49.235330.parquet"]}, {"split": "2023_10_01T13_33_23.488564", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T13-33-23.488564.parquet", 
"**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-management|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-10-01T13-33-23.488564.parquet", 
"**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-virology|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-10-01T13-33-23.488564.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-10-01T13-33-23.488564.parquet", 
"**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-management|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-virology|5_2023-10-01T13-33-23.488564.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-10-01T13-33-23.488564.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_27T11_25_49.235330", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-27T11:25:49.235330.parquet"]}, {"split": "2023_10_01T13_33_23.488564", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T13-33-23.488564.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-01T13-33-23.488564.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_27T11_25_49.235330", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-27T11:25:49.235330.parquet"]}, {"split": "2023_10_01T13_33_23.488564", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-10-01T13-33-23.488564.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-10-01T13-33-23.488564.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_27T11_25_49.235330", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-27T11:25:49.235330.parquet"]}, {"split": "2023_10_01T13_33_23.488564", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-10-01T13-33-23.488564.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-10-01T13-33-23.488564.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_27T11_25_49.235330", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-27T11:25:49.235330.parquet"]}, {"split": 
"2023_10_01T13_33_23.488564", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T13-33-23.488564.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-10-01T13-33-23.488564.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_27T11_25_49.235330", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-27T11:25:49.235330.parquet"]}, {"split": "2023_10_01T13_33_23.488564", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T13-33-23.488564.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-01T13-33-23.488564.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_27T11_25_49.235330", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-27T11:25:49.235330.parquet"]}, {"split": "2023_10_01T13_33_23.488564", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-10-01T13-33-23.488564.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-10-01T13-33-23.488564.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_27T11_25_49.235330", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-27T11:25:49.235330.parquet"]}, {"split": "2023_10_01T13_33_23.488564", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T13-33-23.488564.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-10-01T13-33-23.488564.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_27T11_25_49.235330", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-27T11:25:49.235330.parquet"]}, {"split": "2023_10_01T13_33_23.488564", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T13-33-23.488564.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-10-01T13-33-23.488564.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_27T11_25_49.235330", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-27T11:25:49.235330.parquet"]}, {"split": "2023_10_01T13_33_23.488564", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T13-33-23.488564.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-10-01T13-33-23.488564.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_27T11_25_49.235330", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-27T11:25:49.235330.parquet"]}, {"split": "2023_10_01T13_33_23.488564", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T13-33-23.488564.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-10-01T13-33-23.488564.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_27T11_25_49.235330", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-27T11:25:49.235330.parquet"]}, {"split": "2023_10_01T13_33_23.488564", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-10-01T13-33-23.488564.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-10-01T13-33-23.488564.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_27T11_25_49.235330", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-27T11:25:49.235330.parquet"]}, {"split": "2023_10_01T13_33_23.488564", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-10-01T13-33-23.488564.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-10-01T13-33-23.488564.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_27T11_25_49.235330", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-27T11:25:49.235330.parquet"]}, {"split": "2023_10_01T13_33_23.488564", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T13-33-23.488564.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-01T13-33-23.488564.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_27T11_25_49.235330", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-27T11:25:49.235330.parquet"]}, {"split": "2023_10_01T13_33_23.488564", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-10-01T13-33-23.488564.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-10-01T13-33-23.488564.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_27T11_25_49.235330", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-27T11:25:49.235330.parquet"]}, {"split": "2023_10_01T13_33_23.488564", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T13-33-23.488564.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-01T13-33-23.488564.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_27T11_25_49.235330", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-27T11:25:49.235330.parquet"]}, {"split": "2023_10_01T13_33_23.488564", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T13-33-23.488564.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-01T13-33-23.488564.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_27T11_25_49.235330", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-27T11:25:49.235330.parquet"]}, {"split": "2023_10_01T13_33_23.488564", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T13-33-23.488564.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-10-01T13-33-23.488564.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_27T11_25_49.235330", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-27T11:25:49.235330.parquet"]}, {"split": "2023_10_01T13_33_23.488564", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-10-01T13-33-23.488564.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-10-01T13-33-23.488564.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", 
"data_files": [{"split": "2023_07_27T11_25_49.235330", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-27T11:25:49.235330.parquet"]}, {"split": "2023_10_01T13_33_23.488564", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T13-33-23.488564.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-10-01T13-33-23.488564.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_27T11_25_49.235330", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-27T11:25:49.235330.parquet"]}, {"split": "2023_10_01T13_33_23.488564", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T13-33-23.488564.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-01T13-33-23.488564.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_27T11_25_49.235330", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-27T11:25:49.235330.parquet"]}, {"split": "2023_10_01T13_33_23.488564", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T13-33-23.488564.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-01T13-33-23.488564.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_27T11_25_49.235330", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-27T11:25:49.235330.parquet"]}, {"split": "2023_10_01T13_33_23.488564", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T13-33-23.488564.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-01T13-33-23.488564.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_27T11_25_49.235330", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-27T11:25:49.235330.parquet"]}, {"split": "2023_10_01T13_33_23.488564", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T13-33-23.488564.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-10-01T13-33-23.488564.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_27T11_25_49.235330", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-27T11:25:49.235330.parquet"]}, {"split": "2023_10_01T13_33_23.488564", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T13-33-23.488564.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-01T13-33-23.488564.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_27T11_25_49.235330", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-27T11:25:49.235330.parquet"]}, {"split": "2023_10_01T13_33_23.488564", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T13-33-23.488564.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-01T13-33-23.488564.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_27T11_25_49.235330", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-27T11:25:49.235330.parquet"]}, {"split": "2023_10_01T13_33_23.488564", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T13-33-23.488564.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-01T13-33-23.488564.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_27T11_25_49.235330", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-27T11:25:49.235330.parquet"]}, {"split": "2023_10_01T13_33_23.488564", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T13-33-23.488564.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-01T13-33-23.488564.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_27T11_25_49.235330", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-27T11:25:49.235330.parquet"]}, {"split": "2023_10_01T13_33_23.488564", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T13-33-23.488564.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-10-01T13-33-23.488564.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_27T11_25_49.235330", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-27T11:25:49.235330.parquet"]}, {"split": "2023_10_01T13_33_23.488564", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T13-33-23.488564.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-01T13-33-23.488564.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_27T11_25_49.235330", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-27T11:25:49.235330.parquet"]}, {"split": "2023_10_01T13_33_23.488564", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T13-33-23.488564.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-01T13-33-23.488564.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_27T11_25_49.235330", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-27T11:25:49.235330.parquet"]}, {"split": "2023_10_01T13_33_23.488564", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T13-33-23.488564.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-01T13-33-23.488564.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_27T11_25_49.235330", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-27T11:25:49.235330.parquet"]}, {"split": "2023_10_01T13_33_23.488564", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T13-33-23.488564.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-01T13-33-23.488564.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_27T11_25_49.235330", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-27T11:25:49.235330.parquet"]}, {"split": "2023_10_01T13_33_23.488564", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-10-01T13-33-23.488564.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-10-01T13-33-23.488564.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_27T11_25_49.235330", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-27T11:25:49.235330.parquet"]}, {"split": "2023_10_01T13_33_23.488564", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T13-33-23.488564.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-10-01T13-33-23.488564.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_27T11_25_49.235330", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-27T11:25:49.235330.parquet"]}, {"split": "2023_10_01T13_33_23.488564", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-10-01T13-33-23.488564.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-10-01T13-33-23.488564.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_27T11_25_49.235330", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-27T11:25:49.235330.parquet"]}, {"split": "2023_10_01T13_33_23.488564", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T13-33-23.488564.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-10-01T13-33-23.488564.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_27T11_25_49.235330", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-27T11:25:49.235330.parquet"]}, {"split": "2023_10_01T13_33_23.488564", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T13-33-23.488564.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-01T13-33-23.488564.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_27T11_25_49.235330", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-27T11:25:49.235330.parquet"]}, {"split": "2023_10_01T13_33_23.488564", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T13-33-23.488564.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-10-01T13-33-23.488564.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_27T11_25_49.235330", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-27T11:25:49.235330.parquet"]}, {"split": "2023_10_01T13_33_23.488564", "path": ["**/details_harness|hendrycksTest-management|5_2023-10-01T13-33-23.488564.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-10-01T13-33-23.488564.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_27T11_25_49.235330", "path": 
["**/details_harness|hendrycksTest-marketing|5_2023-07-27T11:25:49.235330.parquet"]}, {"split": "2023_10_01T13_33_23.488564", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-10-01T13-33-23.488564.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-10-01T13-33-23.488564.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_27T11_25_49.235330", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-27T11:25:49.235330.parquet"]}, {"split": "2023_10_01T13_33_23.488564", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T13-33-23.488564.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-10-01T13-33-23.488564.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_27T11_25_49.235330", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-27T11:25:49.235330.parquet"]}, {"split": "2023_10_01T13_33_23.488564", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T13-33-23.488564.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-10-01T13-33-23.488564.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_27T11_25_49.235330", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-27T11:25:49.235330.parquet"]}, {"split": "2023_10_01T13_33_23.488564", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T13-33-23.488564.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-10-01T13-33-23.488564.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_27T11_25_49.235330", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-27T11:25:49.235330.parquet"]}, {"split": "2023_10_01T13_33_23.488564", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T13-33-23.488564.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-01T13-33-23.488564.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_27T11_25_49.235330", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-27T11:25:49.235330.parquet"]}, {"split": "2023_10_01T13_33_23.488564", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-10-01T13-33-23.488564.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-10-01T13-33-23.488564.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_27T11_25_49.235330", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-27T11:25:49.235330.parquet"]}, {"split": "2023_10_01T13_33_23.488564", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-10-01T13-33-23.488564.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-10-01T13-33-23.488564.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_27T11_25_49.235330", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-27T11:25:49.235330.parquet"]}, {"split": "2023_10_01T13_33_23.488564", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-10-01T13-33-23.488564.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-prehistory|5_2023-10-01T13-33-23.488564.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_27T11_25_49.235330", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-27T11:25:49.235330.parquet"]}, {"split": "2023_10_01T13_33_23.488564", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T13-33-23.488564.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-10-01T13-33-23.488564.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_27T11_25_49.235330", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-27T11:25:49.235330.parquet"]}, {"split": "2023_10_01T13_33_23.488564", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-10-01T13-33-23.488564.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-10-01T13-33-23.488564.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_27T11_25_49.235330", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-27T11:25:49.235330.parquet"]}, {"split": "2023_10_01T13_33_23.488564", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T13-33-23.488564.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-10-01T13-33-23.488564.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_27T11_25_49.235330", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-27T11:25:49.235330.parquet"]}, {"split": "2023_10_01T13_33_23.488564", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T13-33-23.488564.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-10-01T13-33-23.488564.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_27T11_25_49.235330", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-27T11:25:49.235330.parquet"]}, {"split": "2023_10_01T13_33_23.488564", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-10-01T13-33-23.488564.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-10-01T13-33-23.488564.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_27T11_25_49.235330", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-27T11:25:49.235330.parquet"]}, {"split": "2023_10_01T13_33_23.488564", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-10-01T13-33-23.488564.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-10-01T13-33-23.488564.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_27T11_25_49.235330", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-27T11:25:49.235330.parquet"]}, {"split": "2023_10_01T13_33_23.488564", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-10-01T13-33-23.488564.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-10-01T13-33-23.488564.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_27T11_25_49.235330", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-27T11:25:49.235330.parquet"]}, {"split": "2023_10_01T13_33_23.488564", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T13-33-23.488564.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-01T13-33-23.488564.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_27T11_25_49.235330", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-27T11:25:49.235330.parquet"]}, {"split": "2023_10_01T13_33_23.488564", "path": ["**/details_harness|hendrycksTest-virology|5_2023-10-01T13-33-23.488564.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-10-01T13-33-23.488564.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_27T11_25_49.235330", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-27T11:25:49.235330.parquet"]}, {"split": "2023_10_01T13_33_23.488564", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-10-01T13-33-23.488564.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-10-01T13-33-23.488564.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_27T11_25_49.235330", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-27T11:25:49.235330.parquet"]}, {"split": "2023_10_01T13_33_23.488564", "path": ["**/details_harness|truthfulqa:mc|0_2023-10-01T13-33-23.488564.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-10-01T13-33-23.488564.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_15T08_54_37.748891", "path": ["**/details_harness|winogrande|5_2023-10-15T08-54-37.748891.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-15T08-54-37.748891.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_27T11_25_49.235330", "path": ["results_2023-07-27T11:25:49.235330.parquet"]}, {"split": "2023_10_01T13_33_23.488564", "path": ["results_2023-10-01T13-33-23.488564.parquet"]}, {"split": "2023_10_15T08_54_37.748891", "path": ["results_2023-10-15T08-54-37.748891.parquet"]}, {"split": "latest", "path": ["results_2023-10-15T08-54-37.748891.parquet"]}]}]}
|
2023-10-15T07:54:50+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of wenge-research/yayi-13b-llama2
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model wenge-research/yayi-13b-llama2 on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
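A minimal sketch, assuming this dataset follows the leaderboard's usual `details_<org>__<model>` naming pattern (the exact repo id is not spelled out in this stripped card):

```python
from datasets import load_dataset

# Assumed repo id, following the standard open-llm-leaderboard naming pattern
data = load_dataset("open-llm-leaderboard/details_wenge-research__yayi-13b-llama2",
    "harness_winogrande_5",
    split="train")
```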
## Latest results
These are the latest results from run 2023-10-15T08:54:37.748891 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of wenge-research/yayi-13b-llama2",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model wenge-research/yayi-13b-llama2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-15T08:54:37.748891(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of wenge-research/yayi-13b-llama2",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model wenge-research/yayi-13b-llama2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-15T08:54:37.748891(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
23,
31,
171,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of wenge-research/yayi-13b-llama2## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model wenge-research/yayi-13b-llama2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-15T08:54:37.748891(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
28708e44a493e5c4efd3f836f151903f23464a46
|
# Dataset Card for Evaluation run of wenge-research/yayi-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/wenge-research/yayi-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [wenge-research/yayi-7b](https://huggingface.co/wenge-research/yayi-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_wenge-research__yayi-7b",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-23T06:30:46.171350](https://huggingface.co/datasets/open-llm-leaderboard/details_wenge-research__yayi-7b/blob/main/results_2023-09-23T06-30-46.171350.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.06774328859060402,
"em_stderr": 0.0025735970400074747,
"f1": 0.14674916107382516,
"f1_stderr": 0.0029879145056579317,
"acc": 0.31591433083229564,
"acc_stderr": 0.008118947219787587
},
"harness|drop|3": {
"em": 0.06774328859060402,
"em_stderr": 0.0025735970400074747,
"f1": 0.14674916107382516,
"f1_stderr": 0.0029879145056579317
},
"harness|gsm8k|5": {
"acc": 0.009097801364670205,
"acc_stderr": 0.002615326510775673
},
"harness|winogrande|5": {
"acc": 0.6227308602999211,
"acc_stderr": 0.013622567928799501
}
}
```
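To inspect the aggregated numbers above programmatically, a minimal sketch that loads the "results" configuration directly (split names are taken from the config list in this card's metadata; "latest" always resolves to the most recent run):

```python
from datasets import load_dataset

# Load only the aggregated results for this model; one row per evaluated run
results = load_dataset("open-llm-leaderboard/details_wenge-research__yayi-7b",
    "results",
    split="latest")
print(results[0])  # aggregated metrics for the latest run (exact schema may vary)
```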
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_wenge-research__yayi-7b
|
[
"region:us"
] |
2023-08-17T23:13:19+00:00
|
{"pretty_name": "Evaluation run of wenge-research/yayi-7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [wenge-research/yayi-7b](https://huggingface.co/wenge-research/yayi-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_wenge-research__yayi-7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-23T06:30:46.171350](https://huggingface.co/datasets/open-llm-leaderboard/details_wenge-research__yayi-7b/blob/main/results_2023-09-23T06-30-46.171350.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.06774328859060402,\n \"em_stderr\": 0.0025735970400074747,\n \"f1\": 0.14674916107382516,\n \"f1_stderr\": 0.0029879145056579317,\n \"acc\": 0.31591433083229564,\n \"acc_stderr\": 0.008118947219787587\n },\n \"harness|drop|3\": {\n \"em\": 0.06774328859060402,\n \"em_stderr\": 0.0025735970400074747,\n \"f1\": 0.14674916107382516,\n \"f1_stderr\": 0.0029879145056579317\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.009097801364670205,\n \"acc_stderr\": 0.002615326510775673\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6227308602999211,\n \"acc_stderr\": 0.013622567928799501\n }\n}\n```", "repo_url": "https://huggingface.co/wenge-research/yayi-7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_01T15_19_24.431670", "path": ["**/details_harness|arc:challenge|25_2023-08-01T15:19:24.431670.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-01T15:19:24.431670.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_23T06_30_46.171350", "path": ["**/details_harness|drop|3_2023-09-23T06-30-46.171350.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-23T06-30-46.171350.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_23T06_30_46.171350", "path": ["**/details_harness|gsm8k|5_2023-09-23T06-30-46.171350.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-23T06-30-46.171350.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_01T15_19_24.431670", "path": ["**/details_harness|hellaswag|10_2023-08-01T15:19:24.431670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-01T15:19:24.431670.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_08_01T15_19_24.431670", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-01T15:19:24.431670.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-01T15:19:24.431670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-01T15:19:24.431670.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-01T15:19:24.431670.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-01T15:19:24.431670.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_08_01T15_19_24.431670", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-01T15:19:24.431670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-01T15:19:24.431670.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_01T15_19_24.431670", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-01T15:19:24.431670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-01T15:19:24.431670.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_01T15_19_24.431670", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-01T15:19:24.431670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-01T15:19:24.431670.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_01T15_19_24.431670", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-01T15:19:24.431670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-01T15:19:24.431670.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_01T15_19_24.431670", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-01T15:19:24.431670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-01T15:19:24.431670.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_01T15_19_24.431670", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-01T15:19:24.431670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-01T15:19:24.431670.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_01T15_19_24.431670", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-01T15:19:24.431670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-01T15:19:24.431670.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_01T15_19_24.431670", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-01T15:19:24.431670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-01T15:19:24.431670.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_01T15_19_24.431670", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-01T15:19:24.431670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-01T15:19:24.431670.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_01T15_19_24.431670", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-01T15:19:24.431670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-01T15:19:24.431670.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_01T15_19_24.431670", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-01T15:19:24.431670.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-01T15:19:24.431670.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_01T15_19_24.431670", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-01T15:19:24.431670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-01T15:19:24.431670.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_01T15_19_24.431670", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-01T15:19:24.431670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-01T15:19:24.431670.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_01T15_19_24.431670", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-01T15:19:24.431670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-01T15:19:24.431670.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_01T15_19_24.431670", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-01T15:19:24.431670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-01T15:19:24.431670.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_01T15_19_24.431670", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-01T15:19:24.431670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-01T15:19:24.431670.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_01T15_19_24.431670", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-01T15:19:24.431670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-01T15:19:24.431670.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_01T15_19_24.431670", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-01T15:19:24.431670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-01T15:19:24.431670.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_01T15_19_24.431670", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-01T15:19:24.431670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-01T15:19:24.431670.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_01T15_19_24.431670", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-01T15:19:24.431670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-01T15:19:24.431670.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_01T15_19_24.431670", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-01T15:19:24.431670.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-01T15:19:24.431670.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_01T15_19_24.431670", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-01T15:19:24.431670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-01T15:19:24.431670.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_01T15_19_24.431670", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-01T15:19:24.431670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-01T15:19:24.431670.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_01T15_19_24.431670", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-01T15:19:24.431670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-01T15:19:24.431670.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_01T15_19_24.431670", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-01T15:19:24.431670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-01T15:19:24.431670.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_01T15_19_24.431670", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-01T15:19:24.431670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-01T15:19:24.431670.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_01T15_19_24.431670", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-01T15:19:24.431670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-01T15:19:24.431670.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_01T15_19_24.431670", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-01T15:19:24.431670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-01T15:19:24.431670.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_01T15_19_24.431670", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-01T15:19:24.431670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-01T15:19:24.431670.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_01T15_19_24.431670", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-01T15:19:24.431670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-01T15:19:24.431670.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_01T15_19_24.431670", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-01T15:19:24.431670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-01T15:19:24.431670.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_01T15_19_24.431670", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-01T15:19:24.431670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-01T15:19:24.431670.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_01T15_19_24.431670", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-01T15:19:24.431670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-01T15:19:24.431670.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_01T15_19_24.431670", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-01T15:19:24.431670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-01T15:19:24.431670.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_01T15_19_24.431670", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-01T15:19:24.431670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-01T15:19:24.431670.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_01T15_19_24.431670", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-01T15:19:24.431670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-01T15:19:24.431670.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_01T15_19_24.431670", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-01T15:19:24.431670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-01T15:19:24.431670.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_01T15_19_24.431670", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-01T15:19:24.431670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-01T15:19:24.431670.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_01T15_19_24.431670", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-01T15:19:24.431670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-01T15:19:24.431670.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_01T15_19_24.431670", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-01T15:19:24.431670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-01T15:19:24.431670.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_01T15_19_24.431670", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-01T15:19:24.431670.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-01T15:19:24.431670.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_01T15_19_24.431670", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-01T15:19:24.431670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-01T15:19:24.431670.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_01T15_19_24.431670", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-01T15:19:24.431670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-01T15:19:24.431670.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_01T15_19_24.431670", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-01T15:19:24.431670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-01T15:19:24.431670.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_01T15_19_24.431670", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-01T15:19:24.431670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-01T15:19:24.431670.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_01T15_19_24.431670", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-01T15:19:24.431670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-01T15:19:24.431670.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_01T15_19_24.431670", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-01T15:19:24.431670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-01T15:19:24.431670.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_01T15_19_24.431670", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-01T15:19:24.431670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-01T15:19:24.431670.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_01T15_19_24.431670", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-01T15:19:24.431670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-01T15:19:24.431670.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_01T15_19_24.431670", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-01T15:19:24.431670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-01T15:19:24.431670.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_01T15_19_24.431670", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-01T15:19:24.431670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-01T15:19:24.431670.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_08_01T15_19_24.431670", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-01T15:19:24.431670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-01T15:19:24.431670.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_01T15_19_24.431670", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-01T15:19:24.431670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-01T15:19:24.431670.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_01T15_19_24.431670", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-01T15:19:24.431670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-01T15:19:24.431670.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_01T15_19_24.431670", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-01T15:19:24.431670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-01T15:19:24.431670.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_01T15_19_24.431670", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-01T15:19:24.431670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-01T15:19:24.431670.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_01T15_19_24.431670", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-01T15:19:24.431670.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-01T15:19:24.431670.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_01T15_19_24.431670", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-01T15:19:24.431670.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-01T15:19:24.431670.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_23T06_30_46.171350", "path": ["**/details_harness|winogrande|5_2023-09-23T06-30-46.171350.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-23T06-30-46.171350.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_01T15_19_24.431670", "path": ["results_2023-08-01T15:19:24.431670.parquet"]}, {"split": "2023_09_23T06_30_46.171350", "path": ["results_2023-09-23T06-30-46.171350.parquet"]}, {"split": "latest", "path": ["results_2023-09-23T06-30-46.171350.parquet"]}]}]}
|
2023-09-23T05:30:58+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of wenge-research/yayi-7b
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model wenge-research/yayi-7b on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the runs (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
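The original snippet was stripped from this copy of the card, so the sketch below is a reconstruction. The repository id `open-llm-leaderboard/details_wenge-research__yayi-7b` is inferred from the `details_<org>__<model>` naming pattern used by the other cards in this dump, not quoted from the original.
```python
from datasets import load_dataset

# Repository id inferred from the leaderboard's details_<org>__<model>
# convention; the original URL was stripped from this copy of the card.
data = load_dataset("open-llm-leaderboard/details_wenge-research__yayi-7b",
	"harness_winogrande_5",
	split="train")
```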
## Latest results
These are the latest results from run 2023-09-23T06:30:46.171350 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of wenge-research/yayi-7b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model wenge-research/yayi-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-23T06:30:46.171350(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of wenge-research/yayi-7b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model wenge-research/yayi-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-23T06:30:46.171350(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
19,
31,
167,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of wenge-research/yayi-7b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model wenge-research/yayi-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-23T06:30:46.171350(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
3504a536181ec34d088013469d96f5b058a85399
|
# Dataset Card for Evaluation run of Salesforce/codegen-6B-multi
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Salesforce/codegen-6B-multi
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [Salesforce/codegen-6B-multi](https://huggingface.co/Salesforce/codegen-6B-multi) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the runs (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Salesforce__codegen-6B-multi",
"harness_winogrande_5",
split="train")
```
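Since the card documents 64 per-task configurations, it can help to enumerate them before loading; a minimal sketch using the `datasets` library's config-listing helper:
```python
from datasets import get_dataset_config_names

# Enumerates the per-task configurations documented above
# (harness_arc_challenge_25, harness_gsm8k_5, harness_winogrande_5, ...).
configs = get_dataset_config_names("open-llm-leaderboard/details_Salesforce__codegen-6B-multi")
print(len(configs), "configurations:", configs[:5])
```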
## Latest results
These are the [latest results from run 2023-09-17T16:03:24.243188](https://huggingface.co/datasets/open-llm-leaderboard/details_Salesforce__codegen-6B-multi/blob/main/results_2023-09-17T16-03-24.243188.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0010486577181208054,
"em_stderr": 0.0003314581465219097,
"f1": 0.04059878355704704,
"f1_stderr": 0.0011641328961688674,
"acc": 0.274462308809441,
"acc_stderr": 0.008365299129010984
},
"harness|drop|3": {
"em": 0.0010486577181208054,
"em_stderr": 0.0003314581465219097,
"f1": 0.04059878355704704,
"f1_stderr": 0.0011641328961688674
},
"harness|gsm8k|5": {
"acc": 0.009855951478392721,
"acc_stderr": 0.0027210765770416634
},
"harness|winogrande|5": {
"acc": 0.5390686661404893,
"acc_stderr": 0.014009521680980306
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_Salesforce__codegen-6B-multi
|
[
"region:us"
] |
2023-08-17T23:13:28+00:00
|
{"pretty_name": "Evaluation run of Salesforce/codegen-6B-multi", "dataset_summary": "Dataset automatically created during the evaluation run of model [Salesforce/codegen-6B-multi](https://huggingface.co/Salesforce/codegen-6B-multi) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Salesforce__codegen-6B-multi\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-17T16:03:24.243188](https://huggingface.co/datasets/open-llm-leaderboard/details_Salesforce__codegen-6B-multi/blob/main/results_2023-09-17T16-03-24.243188.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0010486577181208054,\n \"em_stderr\": 0.0003314581465219097,\n \"f1\": 0.04059878355704704,\n \"f1_stderr\": 0.0011641328961688674,\n \"acc\": 0.274462308809441,\n \"acc_stderr\": 0.008365299129010984\n },\n \"harness|drop|3\": {\n \"em\": 0.0010486577181208054,\n \"em_stderr\": 0.0003314581465219097,\n \"f1\": 0.04059878355704704,\n \"f1_stderr\": 0.0011641328961688674\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.009855951478392721,\n \"acc_stderr\": 0.0027210765770416634\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5390686661404893,\n \"acc_stderr\": 0.014009521680980306\n }\n}\n```", "repo_url": "https://huggingface.co/Salesforce/codegen-6B-multi", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T15_45_31.185140", "path": ["**/details_harness|arc:challenge|25_2023-07-19T15:45:31.185140.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T15:45:31.185140.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_17T16_03_24.243188", "path": ["**/details_harness|drop|3_2023-09-17T16-03-24.243188.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-17T16-03-24.243188.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_17T16_03_24.243188", "path": ["**/details_harness|gsm8k|5_2023-09-17T16-03-24.243188.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-17T16-03-24.243188.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T15_45_31.185140", "path": ["**/details_harness|hellaswag|10_2023-07-19T15:45:31.185140.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T15:45:31.185140.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", 
"data_files": [{"split": "2023_07_19T15_45_31.185140", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:45:31.185140.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:45:31.185140.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:45:31.185140.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T15:45:31.185140.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:45:31.185140.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T15_45_31.185140", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:45:31.185140.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:45:31.185140.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T15_45_31.185140", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:45:31.185140.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:45:31.185140.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T15_45_31.185140", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:45:31.185140.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:45:31.185140.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T15_45_31.185140", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:45:31.185140.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:45:31.185140.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T15_45_31.185140", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:45:31.185140.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:45:31.185140.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T15_45_31.185140", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:45:31.185140.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:45:31.185140.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T15_45_31.185140", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:45:31.185140.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:45:31.185140.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T15_45_31.185140", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:45:31.185140.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:45:31.185140.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T15_45_31.185140", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:45:31.185140.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:45:31.185140.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T15_45_31.185140", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:45:31.185140.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:45:31.185140.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T15_45_31.185140", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:45:31.185140.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:45:31.185140.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T15_45_31.185140", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:45:31.185140.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:45:31.185140.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T15_45_31.185140", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:45:31.185140.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:45:31.185140.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T15_45_31.185140", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:45:31.185140.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:45:31.185140.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T15_45_31.185140", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:45:31.185140.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:45:31.185140.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T15_45_31.185140", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:45:31.185140.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:45:31.185140.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T15_45_31.185140", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:45:31.185140.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:45:31.185140.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T15_45_31.185140", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:45:31.185140.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:45:31.185140.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T15_45_31.185140", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:45:31.185140.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:45:31.185140.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T15_45_31.185140", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:45:31.185140.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:45:31.185140.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T15_45_31.185140", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:45:31.185140.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:45:31.185140.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T15_45_31.185140", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:45:31.185140.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:45:31.185140.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T15_45_31.185140", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:45:31.185140.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:45:31.185140.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T15_45_31.185140", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:45:31.185140.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:45:31.185140.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T15_45_31.185140", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:45:31.185140.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:45:31.185140.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T15_45_31.185140", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:45:31.185140.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:45:31.185140.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T15_45_31.185140", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:45:31.185140.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:45:31.185140.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T15_45_31.185140", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:45:31.185140.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:45:31.185140.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T15_45_31.185140", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:45:31.185140.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:45:31.185140.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T15_45_31.185140", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:45:31.185140.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:45:31.185140.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T15_45_31.185140", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:45:31.185140.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:45:31.185140.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T15_45_31.185140", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:45:31.185140.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:45:31.185140.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T15_45_31.185140", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:45:31.185140.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:45:31.185140.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T15_45_31.185140", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:45:31.185140.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:45:31.185140.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T15_45_31.185140", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:45:31.185140.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:45:31.185140.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T15_45_31.185140", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:45:31.185140.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:45:31.185140.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T15_45_31.185140", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:45:31.185140.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:45:31.185140.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T15_45_31.185140", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:45:31.185140.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:45:31.185140.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T15_45_31.185140", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T15:45:31.185140.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T15:45:31.185140.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T15_45_31.185140", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:45:31.185140.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:45:31.185140.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T15_45_31.185140", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:45:31.185140.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:45:31.185140.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T15_45_31.185140", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:45:31.185140.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:45:31.185140.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T15_45_31.185140", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:45:31.185140.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:45:31.185140.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T15_45_31.185140", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:45:31.185140.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:45:31.185140.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T15_45_31.185140", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:45:31.185140.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:45:31.185140.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T15_45_31.185140", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:45:31.185140.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:45:31.185140.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T15_45_31.185140", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:45:31.185140.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:45:31.185140.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T15_45_31.185140", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:45:31.185140.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:45:31.185140.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T15_45_31.185140", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:45:31.185140.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:45:31.185140.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T15_45_31.185140", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:45:31.185140.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:45:31.185140.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T15_45_31.185140", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:45:31.185140.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:45:31.185140.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T15_45_31.185140", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:45:31.185140.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:45:31.185140.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T15_45_31.185140", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:45:31.185140.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:45:31.185140.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T15_45_31.185140", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:45:31.185140.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:45:31.185140.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T15_45_31.185140", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:45:31.185140.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:45:31.185140.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T15_45_31.185140", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T15:45:31.185140.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T15:45:31.185140.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T15_45_31.185140", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:45:31.185140.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:45:31.185140.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T15_45_31.185140", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T15:45:31.185140.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T15:45:31.185140.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_17T16_03_24.243188", "path": ["**/details_harness|winogrande|5_2023-09-17T16-03-24.243188.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-17T16-03-24.243188.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T15_45_31.185140", "path": ["results_2023-07-19T15:45:31.185140.parquet"]}, {"split": "2023_09_17T16_03_24.243188", "path": ["results_2023-09-17T16-03-24.243188.parquet"]}, {"split": "latest", "path": ["results_2023-09-17T16-03-24.243188.parquet"]}]}]}
|
2023-09-17T15:03:35+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of Salesforce/codegen-6B-multi
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model Salesforce/codegen-6B-multi on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the runs (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
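The code block was removed when this copy of the card was processed; a minimal reconstruction, using the repository id given elsewhere in this row:
```python
from datasets import load_dataset

# Repository id taken from this row's dataset id field.
data = load_dataset("open-llm-leaderboard/details_Salesforce__codegen-6B-multi",
	"harness_winogrande_5",
	split="train")
```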
## Latest results
These are the latest results from run 2023-09-17T16:03:24.243188 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of Salesforce/codegen-6B-multi",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Salesforce/codegen-6B-multi on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-17T16:03:24.243188(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Salesforce/codegen-6B-multi",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Salesforce/codegen-6B-multi on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-17T16:03:24.243188(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
18,
31,
166,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Salesforce/codegen-6B-multi## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Salesforce/codegen-6B-multi on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-17T16:03:24.243188(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
a483fea7f6631d832373ed1d67d415e1158a33d2
|
# Dataset Card for Evaluation run of Salesforce/codegen-16B-nl
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Salesforce/codegen-16B-nl
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [Salesforce/codegen-16B-nl](https://huggingface.co/Salesforce/codegen-16B-nl) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 122 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the runs (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Salesforce__codegen-16B-nl",
"harness_winogrande_5",
split="train")
```
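You can also list the available configurations programmatically, or load a specific run by its timestamped split instead of the latest one. A minimal sketch (the config and split names below are taken from this dataset's configuration; swap in whichever task or timestamp you need):
```python
from datasets import get_dataset_config_names, load_dataset

# List every configuration (one per evaluated task, plus "results").
configs = get_dataset_config_names("open-llm-leaderboard/details_Salesforce__codegen-16B-nl")
print(configs[:5])

# Load one specific run by its timestamped split rather than "train"/"latest".
data = load_dataset("open-llm-leaderboard/details_Salesforce__codegen-16B-nl",
                    "harness_winogrande_5",
                    split="2023_09_16T21_34_04.587307")
```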
## Latest results
These are the [latest results from run 2023-09-16T21:34:04.587307](https://huggingface.co/datasets/open-llm-leaderboard/details_Salesforce__codegen-16B-nl/blob/main/results_2023-09-16T21-34-04.587307.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0012583892617449664,
"em_stderr": 0.00036305608931190987,
"f1": 0.050115352348993385,
"f1_stderr": 0.0012004040103300048,
"acc": 0.35304663251500595,
"acc_stderr": 0.00877106572247344
},
"harness|drop|3": {
"em": 0.0012583892617449664,
"em_stderr": 0.00036305608931190987,
"f1": 0.050115352348993385,
"f1_stderr": 0.0012004040103300048
},
"harness|gsm8k|5": {
"acc": 0.026535253980288095,
"acc_stderr": 0.004427045987265165
},
"harness|winogrande|5": {
"acc": 0.6795580110497238,
"acc_stderr": 0.013115085457681714
}
}
```
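To work with these aggregated numbers directly rather than copying them from this card, you can load the "results" configuration; its "latest" split mirrors the JSON above. A minimal sketch, assuming you only need the most recent run (the exact column layout of the results rows may vary between harness versions):
```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics of each run;
# the "latest" split points at the most recent results file.
results = load_dataset("open-llm-leaderboard/details_Salesforce__codegen-16B-nl",
                       "results",
                       split="latest")
print(results[0])  # inspect the aggregated metrics for the latest run
```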
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_Salesforce__codegen-16B-nl
|
[
"region:us"
] |
2023-08-17T23:13:39+00:00
|
{"pretty_name": "Evaluation run of Salesforce/codegen-16B-nl", "dataset_summary": "Dataset automatically created during the evaluation run of model [Salesforce/codegen-16B-nl](https://huggingface.co/Salesforce/codegen-16B-nl) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 122 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Salesforce__codegen-16B-nl\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-16T21:34:04.587307](https://huggingface.co/datasets/open-llm-leaderboard/details_Salesforce__codegen-16B-nl/blob/main/results_2023-09-16T21-34-04.587307.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0012583892617449664,\n \"em_stderr\": 0.00036305608931190987,\n \"f1\": 0.050115352348993385,\n \"f1_stderr\": 0.0012004040103300048,\n \"acc\": 0.35304663251500595,\n \"acc_stderr\": 0.00877106572247344\n },\n \"harness|drop|3\": {\n \"em\": 0.0012583892617449664,\n \"em_stderr\": 0.00036305608931190987,\n \"f1\": 0.050115352348993385,\n \"f1_stderr\": 0.0012004040103300048\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.026535253980288095,\n \"acc_stderr\": 0.004427045987265165\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6795580110497238,\n \"acc_stderr\": 0.013115085457681714\n }\n}\n```", "repo_url": "https://huggingface.co/Salesforce/codegen-16B-nl", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T20_43_28.351317", "path": ["**/details_harness|arc:challenge|25_2023-07-19T20:43:28.351317.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T20:43:28.351317.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_16T21_34_04.587307", "path": ["**/details_harness|drop|3_2023-09-16T21-34-04.587307.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-16T21-34-04.587307.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_16T21_34_04.587307", "path": ["**/details_harness|gsm8k|5_2023-09-16T21-34-04.587307.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-16T21-34-04.587307.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T20_43_28.351317", "path": ["**/details_harness|hellaswag|10_2023-07-19T20:43:28.351317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T20:43:28.351317.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", 
"data_files": [{"split": "2023_07_19T20_43_28.351317", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T20:43:28.351317.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T20:43:28.351317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T20:43:28.351317.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T20:43:28.351317.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T20:43:28.351317.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T20_43_28.351317", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T20:43:28.351317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T20:43:28.351317.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T20_43_28.351317", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T20:43:28.351317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T20:43:28.351317.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T20_43_28.351317", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T20:43:28.351317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T20:43:28.351317.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T20_43_28.351317", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T20:43:28.351317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T20:43:28.351317.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T20_43_28.351317", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T20:43:28.351317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T20:43:28.351317.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T20_43_28.351317", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T20:43:28.351317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T20:43:28.351317.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T20_43_28.351317", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T20:43:28.351317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T20:43:28.351317.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T20_43_28.351317", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T20:43:28.351317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T20:43:28.351317.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T20_43_28.351317", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T20:43:28.351317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T20:43:28.351317.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T20_43_28.351317", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T20:43:28.351317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T20:43:28.351317.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T20_43_28.351317", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T20:43:28.351317.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T20:43:28.351317.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T20_43_28.351317", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T20:43:28.351317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T20:43:28.351317.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T20_43_28.351317", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T20:43:28.351317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T20:43:28.351317.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T20_43_28.351317", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T20:43:28.351317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T20:43:28.351317.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T20_43_28.351317", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T20:43:28.351317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T20:43:28.351317.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T20_43_28.351317", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T20:43:28.351317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T20:43:28.351317.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T20_43_28.351317", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T20:43:28.351317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T20:43:28.351317.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T20_43_28.351317", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T20:43:28.351317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T20:43:28.351317.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T20_43_28.351317", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T20:43:28.351317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T20:43:28.351317.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T20_43_28.351317", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T20:43:28.351317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T20:43:28.351317.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T20_43_28.351317", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T20:43:28.351317.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T20:43:28.351317.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T20_43_28.351317", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T20:43:28.351317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T20:43:28.351317.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T20_43_28.351317", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T20:43:28.351317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T20:43:28.351317.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T20_43_28.351317", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T20:43:28.351317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T20:43:28.351317.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T20_43_28.351317", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T20:43:28.351317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T20:43:28.351317.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T20_43_28.351317", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T20:43:28.351317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T20:43:28.351317.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T20_43_28.351317", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T20:43:28.351317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T20:43:28.351317.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T20_43_28.351317", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T20:43:28.351317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T20:43:28.351317.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T20_43_28.351317", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T20:43:28.351317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T20:43:28.351317.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T20_43_28.351317", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T20:43:28.351317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T20:43:28.351317.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T20_43_28.351317", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T20:43:28.351317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T20:43:28.351317.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T20_43_28.351317", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T20:43:28.351317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T20:43:28.351317.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T20_43_28.351317", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T20:43:28.351317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T20:43:28.351317.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T20_43_28.351317", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T20:43:28.351317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T20:43:28.351317.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T20_43_28.351317", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T20:43:28.351317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T20:43:28.351317.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T20_43_28.351317", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T20:43:28.351317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T20:43:28.351317.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T20_43_28.351317", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T20:43:28.351317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T20:43:28.351317.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T20_43_28.351317", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T20:43:28.351317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T20:43:28.351317.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T20_43_28.351317", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T20:43:28.351317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T20:43:28.351317.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T20_43_28.351317", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T20:43:28.351317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T20:43:28.351317.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T20_43_28.351317", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T20:43:28.351317.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T20:43:28.351317.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T20_43_28.351317", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T20:43:28.351317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T20:43:28.351317.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T20_43_28.351317", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T20:43:28.351317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T20:43:28.351317.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T20_43_28.351317", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T20:43:28.351317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T20:43:28.351317.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T20_43_28.351317", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T20:43:28.351317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T20:43:28.351317.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T20_43_28.351317", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T20:43:28.351317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T20:43:28.351317.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T20_43_28.351317", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T20:43:28.351317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T20:43:28.351317.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T20_43_28.351317", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T20:43:28.351317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T20:43:28.351317.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T20_43_28.351317", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T20:43:28.351317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T20:43:28.351317.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T20_43_28.351317", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T20:43:28.351317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T20:43:28.351317.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T20_43_28.351317", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T20:43:28.351317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T20:43:28.351317.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T20_43_28.351317", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T20:43:28.351317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T20:43:28.351317.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T20_43_28.351317", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T20:43:28.351317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T20:43:28.351317.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T20_43_28.351317", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T20:43:28.351317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T20:43:28.351317.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T20_43_28.351317", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T20:43:28.351317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T20:43:28.351317.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T20_43_28.351317", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T20:43:28.351317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T20:43:28.351317.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T20_43_28.351317", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T20:43:28.351317.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T20:43:28.351317.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T20_43_28.351317", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T20:43:28.351317.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T20:43:28.351317.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_16T21_34_04.587307", "path": ["**/details_harness|winogrande|5_2023-09-16T21-34-04.587307.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-16T21-34-04.587307.parquet"]}]}, {"config_name": "original_mmlu_5", "data_files": [{"split": "2023_08_28T20_44_03.051617", "path": ["**/details_original|mmlu:abstract_algebra|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:anatomy|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:astronomy|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:business_ethics|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:clinical_knowledge|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:college_biology|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:college_chemistry|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:college_computer_science|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:college_mathematics|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:college_medicine|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:college_physics|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:computer_security|5_2023-08-28T20:44:03.051617.parquet", 
"**/details_original|mmlu:conceptual_physics|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:econometrics|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:electrical_engineering|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:elementary_mathematics|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:formal_logic|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:global_facts|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:high_school_biology|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:high_school_chemistry|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:high_school_computer_science|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:high_school_european_history|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:high_school_geography|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:high_school_mathematics|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:high_school_physics|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:high_school_psychology|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:high_school_statistics|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:high_school_us_history|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:high_school_world_history|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:human_aging|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:human_sexuality|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:international_law|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:jurisprudence|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:logical_fallacies|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:machine_learning|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:management|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:marketing|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:medical_genetics|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:miscellaneous|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:moral_disputes|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:moral_scenarios|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:nutrition|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:philosophy|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:prehistory|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:professional_accounting|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:professional_law|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:professional_medicine|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:professional_psychology|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:public_relations|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:security_studies|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:sociology|5_2023-08-28T20:44:03.051617.parquet", 
"**/details_original|mmlu:us_foreign_policy|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:virology|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:world_religions|5_2023-08-28T20:44:03.051617.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:abstract_algebra|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:anatomy|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:astronomy|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:business_ethics|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:clinical_knowledge|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:college_biology|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:college_chemistry|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:college_computer_science|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:college_mathematics|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:college_medicine|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:college_physics|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:computer_security|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:conceptual_physics|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:econometrics|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:electrical_engineering|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:elementary_mathematics|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:formal_logic|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:global_facts|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:high_school_biology|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:high_school_chemistry|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:high_school_computer_science|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:high_school_european_history|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:high_school_geography|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:high_school_mathematics|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:high_school_physics|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:high_school_psychology|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:high_school_statistics|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:high_school_us_history|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:high_school_world_history|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:human_aging|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:human_sexuality|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:international_law|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:jurisprudence|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:logical_fallacies|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:machine_learning|5_2023-08-28T20:44:03.051617.parquet", 
"**/details_original|mmlu:management|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:marketing|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:medical_genetics|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:miscellaneous|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:moral_disputes|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:moral_scenarios|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:nutrition|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:philosophy|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:prehistory|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:professional_accounting|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:professional_law|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:professional_medicine|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:professional_psychology|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:public_relations|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:security_studies|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:sociology|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:us_foreign_policy|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:virology|5_2023-08-28T20:44:03.051617.parquet", "**/details_original|mmlu:world_religions|5_2023-08-28T20:44:03.051617.parquet"]}]}, {"config_name": "original_mmlu_abstract_algebra_5", "data_files": [{"split": "2023_08_28T20_44_03.051617", "path": ["**/details_original|mmlu:abstract_algebra|5_2023-08-28T20:44:03.051617.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:abstract_algebra|5_2023-08-28T20:44:03.051617.parquet"]}]}, {"config_name": "original_mmlu_anatomy_5", "data_files": [{"split": "2023_08_28T20_44_03.051617", "path": ["**/details_original|mmlu:anatomy|5_2023-08-28T20:44:03.051617.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:anatomy|5_2023-08-28T20:44:03.051617.parquet"]}]}, {"config_name": "original_mmlu_astronomy_5", "data_files": [{"split": "2023_08_28T20_44_03.051617", "path": ["**/details_original|mmlu:astronomy|5_2023-08-28T20:44:03.051617.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:astronomy|5_2023-08-28T20:44:03.051617.parquet"]}]}, {"config_name": "original_mmlu_business_ethics_5", "data_files": [{"split": "2023_08_28T20_44_03.051617", "path": ["**/details_original|mmlu:business_ethics|5_2023-08-28T20:44:03.051617.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:business_ethics|5_2023-08-28T20:44:03.051617.parquet"]}]}, {"config_name": "original_mmlu_clinical_knowledge_5", "data_files": [{"split": "2023_08_28T20_44_03.051617", "path": ["**/details_original|mmlu:clinical_knowledge|5_2023-08-28T20:44:03.051617.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:clinical_knowledge|5_2023-08-28T20:44:03.051617.parquet"]}]}, {"config_name": "original_mmlu_college_biology_5", "data_files": [{"split": "2023_08_28T20_44_03.051617", "path": ["**/details_original|mmlu:college_biology|5_2023-08-28T20:44:03.051617.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:college_biology|5_2023-08-28T20:44:03.051617.parquet"]}]}, {"config_name": "original_mmlu_college_chemistry_5", "data_files": [{"split": "2023_08_28T20_44_03.051617", "path": 
["**/details_original|mmlu:college_chemistry|5_2023-08-28T20:44:03.051617.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:college_chemistry|5_2023-08-28T20:44:03.051617.parquet"]}]}, {"config_name": "original_mmlu_college_computer_science_5", "data_files": [{"split": "2023_08_28T20_44_03.051617", "path": ["**/details_original|mmlu:college_computer_science|5_2023-08-28T20:44:03.051617.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:college_computer_science|5_2023-08-28T20:44:03.051617.parquet"]}]}, {"config_name": "original_mmlu_college_mathematics_5", "data_files": [{"split": "2023_08_28T20_44_03.051617", "path": ["**/details_original|mmlu:college_mathematics|5_2023-08-28T20:44:03.051617.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:college_mathematics|5_2023-08-28T20:44:03.051617.parquet"]}]}, {"config_name": "original_mmlu_college_medicine_5", "data_files": [{"split": "2023_08_28T20_44_03.051617", "path": ["**/details_original|mmlu:college_medicine|5_2023-08-28T20:44:03.051617.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:college_medicine|5_2023-08-28T20:44:03.051617.parquet"]}]}, {"config_name": "original_mmlu_college_physics_5", "data_files": [{"split": "2023_08_28T20_44_03.051617", "path": ["**/details_original|mmlu:college_physics|5_2023-08-28T20:44:03.051617.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:college_physics|5_2023-08-28T20:44:03.051617.parquet"]}]}, {"config_name": "original_mmlu_computer_security_5", "data_files": [{"split": "2023_08_28T20_44_03.051617", "path": ["**/details_original|mmlu:computer_security|5_2023-08-28T20:44:03.051617.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:computer_security|5_2023-08-28T20:44:03.051617.parquet"]}]}, {"config_name": "original_mmlu_conceptual_physics_5", "data_files": [{"split": "2023_08_28T20_44_03.051617", "path": ["**/details_original|mmlu:conceptual_physics|5_2023-08-28T20:44:03.051617.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:conceptual_physics|5_2023-08-28T20:44:03.051617.parquet"]}]}, {"config_name": "original_mmlu_econometrics_5", "data_files": [{"split": "2023_08_28T20_44_03.051617", "path": ["**/details_original|mmlu:econometrics|5_2023-08-28T20:44:03.051617.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:econometrics|5_2023-08-28T20:44:03.051617.parquet"]}]}, {"config_name": "original_mmlu_electrical_engineering_5", "data_files": [{"split": "2023_08_28T20_44_03.051617", "path": ["**/details_original|mmlu:electrical_engineering|5_2023-08-28T20:44:03.051617.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:electrical_engineering|5_2023-08-28T20:44:03.051617.parquet"]}]}, {"config_name": "original_mmlu_elementary_mathematics_5", "data_files": [{"split": "2023_08_28T20_44_03.051617", "path": ["**/details_original|mmlu:elementary_mathematics|5_2023-08-28T20:44:03.051617.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:elementary_mathematics|5_2023-08-28T20:44:03.051617.parquet"]}]}, {"config_name": "original_mmlu_formal_logic_5", "data_files": [{"split": "2023_08_28T20_44_03.051617", "path": ["**/details_original|mmlu:formal_logic|5_2023-08-28T20:44:03.051617.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:formal_logic|5_2023-08-28T20:44:03.051617.parquet"]}]}, {"config_name": "original_mmlu_global_facts_5", "data_files": [{"split": "2023_08_28T20_44_03.051617", "path": 
["**/details_original|mmlu:global_facts|5_2023-08-28T20:44:03.051617.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:global_facts|5_2023-08-28T20:44:03.051617.parquet"]}]}, {"config_name": "original_mmlu_high_school_biology_5", "data_files": [{"split": "2023_08_28T20_44_03.051617", "path": ["**/details_original|mmlu:high_school_biology|5_2023-08-28T20:44:03.051617.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_biology|5_2023-08-28T20:44:03.051617.parquet"]}]}, {"config_name": "original_mmlu_high_school_chemistry_5", "data_files": [{"split": "2023_08_28T20_44_03.051617", "path": ["**/details_original|mmlu:high_school_chemistry|5_2023-08-28T20:44:03.051617.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_chemistry|5_2023-08-28T20:44:03.051617.parquet"]}]}, {"config_name": "original_mmlu_high_school_computer_science_5", "data_files": [{"split": "2023_08_28T20_44_03.051617", "path": ["**/details_original|mmlu:high_school_computer_science|5_2023-08-28T20:44:03.051617.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_computer_science|5_2023-08-28T20:44:03.051617.parquet"]}]}, {"config_name": "original_mmlu_high_school_european_history_5", "data_files": [{"split": "2023_08_28T20_44_03.051617", "path": ["**/details_original|mmlu:high_school_european_history|5_2023-08-28T20:44:03.051617.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_european_history|5_2023-08-28T20:44:03.051617.parquet"]}]}, {"config_name": "original_mmlu_high_school_geography_5", "data_files": [{"split": "2023_08_28T20_44_03.051617", "path": ["**/details_original|mmlu:high_school_geography|5_2023-08-28T20:44:03.051617.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_geography|5_2023-08-28T20:44:03.051617.parquet"]}]}, {"config_name": "original_mmlu_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_28T20_44_03.051617", "path": ["**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T20:44:03.051617.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_government_and_politics|5_2023-08-28T20:44:03.051617.parquet"]}]}, {"config_name": "original_mmlu_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_28T20_44_03.051617", "path": ["**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T20:44:03.051617.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_macroeconomics|5_2023-08-28T20:44:03.051617.parquet"]}]}, {"config_name": "original_mmlu_high_school_mathematics_5", "data_files": [{"split": "2023_08_28T20_44_03.051617", "path": ["**/details_original|mmlu:high_school_mathematics|5_2023-08-28T20:44:03.051617.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_mathematics|5_2023-08-28T20:44:03.051617.parquet"]}]}, {"config_name": "original_mmlu_high_school_microeconomics_5", "data_files": [{"split": "2023_08_28T20_44_03.051617", "path": ["**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T20:44:03.051617.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_microeconomics|5_2023-08-28T20:44:03.051617.parquet"]}]}, {"config_name": "original_mmlu_high_school_physics_5", "data_files": [{"split": "2023_08_28T20_44_03.051617", "path": ["**/details_original|mmlu:high_school_physics|5_2023-08-28T20:44:03.051617.parquet"]}, {"split": "latest", "path": 
["**/details_original|mmlu:high_school_physics|5_2023-08-28T20:44:03.051617.parquet"]}]}, {"config_name": "original_mmlu_high_school_psychology_5", "data_files": [{"split": "2023_08_28T20_44_03.051617", "path": ["**/details_original|mmlu:high_school_psychology|5_2023-08-28T20:44:03.051617.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_psychology|5_2023-08-28T20:44:03.051617.parquet"]}]}, {"config_name": "original_mmlu_high_school_statistics_5", "data_files": [{"split": "2023_08_28T20_44_03.051617", "path": ["**/details_original|mmlu:high_school_statistics|5_2023-08-28T20:44:03.051617.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_statistics|5_2023-08-28T20:44:03.051617.parquet"]}]}, {"config_name": "original_mmlu_high_school_us_history_5", "data_files": [{"split": "2023_08_28T20_44_03.051617", "path": ["**/details_original|mmlu:high_school_us_history|5_2023-08-28T20:44:03.051617.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_us_history|5_2023-08-28T20:44:03.051617.parquet"]}]}, {"config_name": "original_mmlu_high_school_world_history_5", "data_files": [{"split": "2023_08_28T20_44_03.051617", "path": ["**/details_original|mmlu:high_school_world_history|5_2023-08-28T20:44:03.051617.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_world_history|5_2023-08-28T20:44:03.051617.parquet"]}]}, {"config_name": "original_mmlu_human_aging_5", "data_files": [{"split": "2023_08_28T20_44_03.051617", "path": ["**/details_original|mmlu:human_aging|5_2023-08-28T20:44:03.051617.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:human_aging|5_2023-08-28T20:44:03.051617.parquet"]}]}, {"config_name": "original_mmlu_human_sexuality_5", "data_files": [{"split": "2023_08_28T20_44_03.051617", "path": ["**/details_original|mmlu:human_sexuality|5_2023-08-28T20:44:03.051617.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:human_sexuality|5_2023-08-28T20:44:03.051617.parquet"]}]}, {"config_name": "original_mmlu_international_law_5", "data_files": [{"split": "2023_08_28T20_44_03.051617", "path": ["**/details_original|mmlu:international_law|5_2023-08-28T20:44:03.051617.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:international_law|5_2023-08-28T20:44:03.051617.parquet"]}]}, {"config_name": "original_mmlu_jurisprudence_5", "data_files": [{"split": "2023_08_28T20_44_03.051617", "path": ["**/details_original|mmlu:jurisprudence|5_2023-08-28T20:44:03.051617.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:jurisprudence|5_2023-08-28T20:44:03.051617.parquet"]}]}, {"config_name": "original_mmlu_logical_fallacies_5", "data_files": [{"split": "2023_08_28T20_44_03.051617", "path": ["**/details_original|mmlu:logical_fallacies|5_2023-08-28T20:44:03.051617.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:logical_fallacies|5_2023-08-28T20:44:03.051617.parquet"]}]}, {"config_name": "original_mmlu_machine_learning_5", "data_files": [{"split": "2023_08_28T20_44_03.051617", "path": ["**/details_original|mmlu:machine_learning|5_2023-08-28T20:44:03.051617.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:machine_learning|5_2023-08-28T20:44:03.051617.parquet"]}]}, {"config_name": "original_mmlu_management_5", "data_files": [{"split": "2023_08_28T20_44_03.051617", "path": ["**/details_original|mmlu:management|5_2023-08-28T20:44:03.051617.parquet"]}, {"split": "latest", "path": 
["**/details_original|mmlu:management|5_2023-08-28T20:44:03.051617.parquet"]}]}, {"config_name": "original_mmlu_marketing_5", "data_files": [{"split": "2023_08_28T20_44_03.051617", "path": ["**/details_original|mmlu:marketing|5_2023-08-28T20:44:03.051617.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:marketing|5_2023-08-28T20:44:03.051617.parquet"]}]}, {"config_name": "original_mmlu_medical_genetics_5", "data_files": [{"split": "2023_08_28T20_44_03.051617", "path": ["**/details_original|mmlu:medical_genetics|5_2023-08-28T20:44:03.051617.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:medical_genetics|5_2023-08-28T20:44:03.051617.parquet"]}]}, {"config_name": "original_mmlu_miscellaneous_5", "data_files": [{"split": "2023_08_28T20_44_03.051617", "path": ["**/details_original|mmlu:miscellaneous|5_2023-08-28T20:44:03.051617.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:miscellaneous|5_2023-08-28T20:44:03.051617.parquet"]}]}, {"config_name": "original_mmlu_moral_disputes_5", "data_files": [{"split": "2023_08_28T20_44_03.051617", "path": ["**/details_original|mmlu:moral_disputes|5_2023-08-28T20:44:03.051617.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:moral_disputes|5_2023-08-28T20:44:03.051617.parquet"]}]}, {"config_name": "original_mmlu_moral_scenarios_5", "data_files": [{"split": "2023_08_28T20_44_03.051617", "path": ["**/details_original|mmlu:moral_scenarios|5_2023-08-28T20:44:03.051617.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:moral_scenarios|5_2023-08-28T20:44:03.051617.parquet"]}]}, {"config_name": "original_mmlu_nutrition_5", "data_files": [{"split": "2023_08_28T20_44_03.051617", "path": ["**/details_original|mmlu:nutrition|5_2023-08-28T20:44:03.051617.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:nutrition|5_2023-08-28T20:44:03.051617.parquet"]}]}, {"config_name": "original_mmlu_philosophy_5", "data_files": [{"split": "2023_08_28T20_44_03.051617", "path": ["**/details_original|mmlu:philosophy|5_2023-08-28T20:44:03.051617.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:philosophy|5_2023-08-28T20:44:03.051617.parquet"]}]}, {"config_name": "original_mmlu_prehistory_5", "data_files": [{"split": "2023_08_28T20_44_03.051617", "path": ["**/details_original|mmlu:prehistory|5_2023-08-28T20:44:03.051617.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:prehistory|5_2023-08-28T20:44:03.051617.parquet"]}]}, {"config_name": "original_mmlu_professional_accounting_5", "data_files": [{"split": "2023_08_28T20_44_03.051617", "path": ["**/details_original|mmlu:professional_accounting|5_2023-08-28T20:44:03.051617.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:professional_accounting|5_2023-08-28T20:44:03.051617.parquet"]}]}, {"config_name": "original_mmlu_professional_law_5", "data_files": [{"split": "2023_08_28T20_44_03.051617", "path": ["**/details_original|mmlu:professional_law|5_2023-08-28T20:44:03.051617.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:professional_law|5_2023-08-28T20:44:03.051617.parquet"]}]}, {"config_name": "original_mmlu_professional_medicine_5", "data_files": [{"split": "2023_08_28T20_44_03.051617", "path": ["**/details_original|mmlu:professional_medicine|5_2023-08-28T20:44:03.051617.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:professional_medicine|5_2023-08-28T20:44:03.051617.parquet"]}]}, {"config_name": "original_mmlu_professional_psychology_5", 
"data_files": [{"split": "2023_08_28T20_44_03.051617", "path": ["**/details_original|mmlu:professional_psychology|5_2023-08-28T20:44:03.051617.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:professional_psychology|5_2023-08-28T20:44:03.051617.parquet"]}]}, {"config_name": "original_mmlu_public_relations_5", "data_files": [{"split": "2023_08_28T20_44_03.051617", "path": ["**/details_original|mmlu:public_relations|5_2023-08-28T20:44:03.051617.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:public_relations|5_2023-08-28T20:44:03.051617.parquet"]}]}, {"config_name": "original_mmlu_security_studies_5", "data_files": [{"split": "2023_08_28T20_44_03.051617", "path": ["**/details_original|mmlu:security_studies|5_2023-08-28T20:44:03.051617.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:security_studies|5_2023-08-28T20:44:03.051617.parquet"]}]}, {"config_name": "original_mmlu_sociology_5", "data_files": [{"split": "2023_08_28T20_44_03.051617", "path": ["**/details_original|mmlu:sociology|5_2023-08-28T20:44:03.051617.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:sociology|5_2023-08-28T20:44:03.051617.parquet"]}]}, {"config_name": "original_mmlu_us_foreign_policy_5", "data_files": [{"split": "2023_08_28T20_44_03.051617", "path": ["**/details_original|mmlu:us_foreign_policy|5_2023-08-28T20:44:03.051617.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:us_foreign_policy|5_2023-08-28T20:44:03.051617.parquet"]}]}, {"config_name": "original_mmlu_virology_5", "data_files": [{"split": "2023_08_28T20_44_03.051617", "path": ["**/details_original|mmlu:virology|5_2023-08-28T20:44:03.051617.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:virology|5_2023-08-28T20:44:03.051617.parquet"]}]}, {"config_name": "original_mmlu_world_religions_5", "data_files": [{"split": "2023_08_28T20_44_03.051617", "path": ["**/details_original|mmlu:world_religions|5_2023-08-28T20:44:03.051617.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:world_religions|5_2023-08-28T20:44:03.051617.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T20_43_28.351317", "path": ["results_2023-07-19T20:43:28.351317.parquet"]}, {"split": "2023_08_28T20_44_03.051617", "path": ["results_2023-08-28T20:44:03.051617.parquet"]}, {"split": "2023_09_16T21_34_04.587307", "path": ["results_2023-09-16T21-34-04.587307.parquet"]}, {"split": "latest", "path": ["results_2023-09-16T21-34-04.587307.parquet"]}]}]}
|
2023-09-16T20:34:17+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of Salesforce/codegen-16B-nl
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model Salesforce/codegen-16B-nl on the Open LLM Leaderboard.
The dataset is composed of 122 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
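As a minimal sketch (the full card's loading snippet was stripped from this condensed view; the repository id and configuration name below follow the naming pattern of the other cards in this dump and are assumptions here):

```python
from datasets import load_dataset

# Assumed repo id and task configuration, following the
# open-llm-leaderboard/details_<org>__<model> naming pattern.
data = load_dataset(
    "open-llm-leaderboard/details_Salesforce__codegen-16B-nl",
    "harness_winogrande_5",
    split="train",  # per the card, "train" always points to the latest results
)
```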
## Latest results
These are the latest results from run 2023-09-16T21:34:04.587307 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
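The aggregated numbers behind that run live in the "results" configuration (the metadata above maps its "latest" split to results_2023-09-16T21-34-04.587307.parquet); a sketch, again assuming the repo id:

```python
from datasets import load_dataset

# "results" holds the aggregated scores; "latest" resolves to the
# most recent results parquet (2023-09-16T21-34-04.587307).
results = load_dataset(
    "open-llm-leaderboard/details_Salesforce__codegen-16B-nl",  # assumed repo id
    "results",
    split="latest",
)
```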
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of Salesforce/codegen-16B-nl",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Salesforce/codegen-16B-nl on the Open LLM Leaderboard.\n\nThe dataset is composed of 122 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-16T21:34:04.587307(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Salesforce/codegen-16B-nl",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Salesforce/codegen-16B-nl on the Open LLM Leaderboard.\n\nThe dataset is composed of 122 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-16T21:34:04.587307(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
18,
31,
166,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Salesforce/codegen-16B-nl## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Salesforce/codegen-16B-nl on the Open LLM Leaderboard.\n\nThe dataset is composed of 122 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-16T21:34:04.587307(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
a69b15bec645275ccf8893c8efcffd47b972e022
|
# Dataset Card for Evaluation run of Salesforce/codegen-6B-nl
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Salesforce/codegen-6B-nl
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [Salesforce/codegen-6B-nl](https://huggingface.co/Salesforce/codegen-6B-nl) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Salesforce__codegen-6B-nl",
"harness_winogrande_5",
split="train")
```
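The aggregated scores shown under "Latest results" below come from the "results" configuration mentioned above; as a sketch, it can be loaded the same way (per this card's metadata, the "latest" split resolves to results_2023-09-23T04-29-16.787145.parquet):

```python
from datasets import load_dataset

# Aggregated results for this model; "latest" points at the most
# recent results parquet rather than a per-task details file.
results = load_dataset(
    "open-llm-leaderboard/details_Salesforce__codegen-6B-nl",
    "results",
    split="latest",
)
```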
## Latest results
These are the [latest results from run 2023-09-23T04:29:16.787145](https://huggingface.co/datasets/open-llm-leaderboard/details_Salesforce__codegen-6B-nl/blob/main/results_2023-09-23T04-29-16.787145.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0008389261744966443,
"em_stderr": 0.0002964962989801256,
"f1": 0.04463716442953032,
"f1_stderr": 0.0011286825965254675,
"acc": 0.34327415533879496,
"acc_stderr": 0.008654369331480736
},
"harness|drop|3": {
"em": 0.0008389261744966443,
"em_stderr": 0.0002964962989801256,
"f1": 0.04463716442953032,
"f1_stderr": 0.0011286825965254675
},
"harness|gsm8k|5": {
"acc": 0.021986353297952996,
"acc_stderr": 0.004039162758110055
},
"harness|winogrande|5": {
"acc": 0.664561957379637,
"acc_stderr": 0.013269575904851416
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_Salesforce__codegen-6B-nl
|
[
"region:us"
] |
2023-08-17T23:13:52+00:00
|
{"pretty_name": "Evaluation run of Salesforce/codegen-6B-nl", "dataset_summary": "Dataset automatically created during the evaluation run of model [Salesforce/codegen-6B-nl](https://huggingface.co/Salesforce/codegen-6B-nl) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Salesforce__codegen-6B-nl\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-23T04:29:16.787145](https://huggingface.co/datasets/open-llm-leaderboard/details_Salesforce__codegen-6B-nl/blob/main/results_2023-09-23T04-29-16.787145.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0008389261744966443,\n \"em_stderr\": 0.0002964962989801256,\n \"f1\": 0.04463716442953032,\n \"f1_stderr\": 0.0011286825965254675,\n \"acc\": 0.34327415533879496,\n \"acc_stderr\": 0.008654369331480736\n },\n \"harness|drop|3\": {\n \"em\": 0.0008389261744966443,\n \"em_stderr\": 0.0002964962989801256,\n \"f1\": 0.04463716442953032,\n \"f1_stderr\": 0.0011286825965254675\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.021986353297952996,\n \"acc_stderr\": 0.004039162758110055\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.664561957379637,\n \"acc_stderr\": 0.013269575904851416\n }\n}\n```", "repo_url": "https://huggingface.co/Salesforce/codegen-6B-nl", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T15_42_44.992291", "path": ["**/details_harness|arc:challenge|25_2023-07-19T15:42:44.992291.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T15:42:44.992291.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_23T04_29_16.787145", "path": ["**/details_harness|drop|3_2023-09-23T04-29-16.787145.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-23T04-29-16.787145.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_23T04_29_16.787145", "path": ["**/details_harness|gsm8k|5_2023-09-23T04-29-16.787145.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-23T04-29-16.787145.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T15_42_44.992291", "path": ["**/details_harness|hellaswag|10_2023-07-19T15:42:44.992291.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T15:42:44.992291.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": 
[{"split": "2023_07_19T15_42_44.992291", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:42:44.992291.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:42:44.992291.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:42:44.992291.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T15:42:44.992291.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:42:44.992291.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T15_42_44.992291", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:42:44.992291.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:42:44.992291.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T15_42_44.992291", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:42:44.992291.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:42:44.992291.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T15_42_44.992291", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:42:44.992291.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:42:44.992291.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T15_42_44.992291", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:42:44.992291.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:42:44.992291.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T15_42_44.992291", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:42:44.992291.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:42:44.992291.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T15_42_44.992291", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:42:44.992291.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:42:44.992291.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T15_42_44.992291", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:42:44.992291.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:42:44.992291.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T15_42_44.992291", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:42:44.992291.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:42:44.992291.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T15_42_44.992291", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:42:44.992291.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:42:44.992291.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T15_42_44.992291", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:42:44.992291.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:42:44.992291.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T15_42_44.992291", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:42:44.992291.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:42:44.992291.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T15_42_44.992291", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:42:44.992291.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:42:44.992291.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T15_42_44.992291", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:42:44.992291.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:42:44.992291.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T15_42_44.992291", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:42:44.992291.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:42:44.992291.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T15_42_44.992291", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:42:44.992291.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:42:44.992291.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T15_42_44.992291", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:42:44.992291.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:42:44.992291.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T15_42_44.992291", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:42:44.992291.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:42:44.992291.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T15_42_44.992291", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:42:44.992291.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:42:44.992291.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T15_42_44.992291", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:42:44.992291.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:42:44.992291.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T15_42_44.992291", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:42:44.992291.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:42:44.992291.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T15_42_44.992291", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:42:44.992291.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:42:44.992291.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T15_42_44.992291", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:42:44.992291.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:42:44.992291.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T15_42_44.992291", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:42:44.992291.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:42:44.992291.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T15_42_44.992291", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:42:44.992291.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:42:44.992291.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T15_42_44.992291", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:42:44.992291.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:42:44.992291.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T15_42_44.992291", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:42:44.992291.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:42:44.992291.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T15_42_44.992291", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:42:44.992291.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:42:44.992291.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T15_42_44.992291", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:42:44.992291.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:42:44.992291.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T15_42_44.992291", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:42:44.992291.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:42:44.992291.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T15_42_44.992291", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:42:44.992291.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:42:44.992291.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T15_42_44.992291", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:42:44.992291.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:42:44.992291.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T15_42_44.992291", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:42:44.992291.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:42:44.992291.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T15_42_44.992291", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:42:44.992291.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:42:44.992291.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T15_42_44.992291", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:42:44.992291.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:42:44.992291.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T15_42_44.992291", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:42:44.992291.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:42:44.992291.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T15_42_44.992291", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:42:44.992291.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:42:44.992291.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T15_42_44.992291", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:42:44.992291.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:42:44.992291.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T15_42_44.992291", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:42:44.992291.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:42:44.992291.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T15_42_44.992291", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T15:42:44.992291.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T15:42:44.992291.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T15_42_44.992291", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:42:44.992291.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:42:44.992291.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T15_42_44.992291", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:42:44.992291.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:42:44.992291.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T15_42_44.992291", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:42:44.992291.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:42:44.992291.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T15_42_44.992291", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:42:44.992291.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:42:44.992291.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T15_42_44.992291", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:42:44.992291.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:42:44.992291.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T15_42_44.992291", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:42:44.992291.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:42:44.992291.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T15_42_44.992291", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:42:44.992291.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:42:44.992291.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T15_42_44.992291", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:42:44.992291.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:42:44.992291.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T15_42_44.992291", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:42:44.992291.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:42:44.992291.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T15_42_44.992291", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:42:44.992291.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:42:44.992291.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T15_42_44.992291", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:42:44.992291.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:42:44.992291.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T15_42_44.992291", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:42:44.992291.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:42:44.992291.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T15_42_44.992291", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:42:44.992291.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:42:44.992291.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T15_42_44.992291", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:42:44.992291.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:42:44.992291.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T15_42_44.992291", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:42:44.992291.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:42:44.992291.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T15_42_44.992291", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:42:44.992291.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:42:44.992291.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T15_42_44.992291", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T15:42:44.992291.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T15:42:44.992291.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T15_42_44.992291", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:42:44.992291.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:42:44.992291.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T15_42_44.992291", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T15:42:44.992291.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T15:42:44.992291.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_23T04_29_16.787145", "path": ["**/details_harness|winogrande|5_2023-09-23T04-29-16.787145.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-23T04-29-16.787145.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T15_42_44.992291", "path": ["results_2023-07-19T15:42:44.992291.parquet"]}, {"split": "2023_09_23T04_29_16.787145", "path": ["results_2023-09-23T04-29-16.787145.parquet"]}, {"split": "latest", "path": ["results_2023-09-23T04-29-16.787145.parquet"]}]}]}
|
2023-09-23T03:29:28+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of Salesforce/codegen-6B-nl
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model Salesforce/codegen-6B-nl on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
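A minimal sketch (the snippet was stripped from this condensed view; the repo id and the "harness_gsm8k_5" configuration both appear in this card's metadata):

```python
from datasets import load_dataset

# One of this card's per-task configurations (see the metadata above).
data = load_dataset(
    "open-llm-leaderboard/details_Salesforce__codegen-6B-nl",
    "harness_gsm8k_5",
    split="latest",  # or "train", which the card says tracks the latest run
)
```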
## Latest results
These are the latest results from run 2023-09-23T04:29:16.787145 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of Salesforce/codegen-6B-nl",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Salesforce/codegen-6B-nl on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-23T04:29:16.787145(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Salesforce/codegen-6B-nl",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Salesforce/codegen-6B-nl on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-23T04:29:16.787145(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
18,
31,
166,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Salesforce/codegen-6B-nl## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Salesforce/codegen-6B-nl on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-23T04:29:16.787145(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
b9840e98ade3b8640196fa7941358e3c417d1b6c
|
# Dataset Card for Evaluation run of AlekseyKorshuk/pygmalion-6b-vicuna-chatml
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/AlekseyKorshuk/pygmalion-6b-vicuna-chatml
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [AlekseyKorshuk/pygmalion-6b-vicuna-chatml](https://huggingface.co/AlekseyKorshuk/pygmalion-6b-vicuna-chatml) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 5 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_AlekseyKorshuk__pygmalion-6b-vicuna-chatml",
"harness_gsm8k_5",
split="train")
```
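Beyond the per-task details, the aggregated scores live in the "results" configuration mentioned above. The sketch below relies only on what this card states (the `results` config name and the `latest` split); the exact column layout of the results parquet is not documented here, so it is inspected rather than hard-coded:
```python
from datasets import load_dataset

# A minimal sketch: "results" aggregates every run, and the "latest" split
# always points at the most recent one (see the summary above).
results = load_dataset(
    "open-llm-leaderboard/details_AlekseyKorshuk__pygmalion-6b-vicuna-chatml",
    "results",
    split="latest",
)

# The parquet schema is not documented in this card, so inspect it first:
print(results.column_names)
print(results[0])
```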
## Latest results
These are the [latest results from run 2023-12-02T15:52:11.848314](https://huggingface.co/datasets/open-llm-leaderboard/details_AlekseyKorshuk__pygmalion-6b-vicuna-chatml/blob/main/results_2023-12-02T15-52-11.848314.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.04397270659590599,
"acc_stderr": 0.005647666449126458
},
"harness|gsm8k|5": {
"acc": 0.04397270659590599,
"acc_stderr": 0.005647666449126458
}
}
```
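The "latest" split always mirrors the newest run, but a specific run can also be addressed directly through its timestamped split name. A minimal sketch, assuming only the split-naming convention visible in this repo's config metadata (run timestamps with dashes and colons replaced by underscores):
```python
from datasets import load_dataset

# A minimal sketch: each run is exposed as a split named after its timestamp,
# with dashes and colons replaced by underscores (per this repo's metadata).
run = load_dataset(
    "open-llm-leaderboard/details_AlekseyKorshuk__pygmalion-6b-vicuna-chatml",
    "harness_gsm8k_5",
    split="2023_12_02T15_52_11.848314",
)
print(len(run))  # number of evaluated GSM8K examples in that run
```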
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_AlekseyKorshuk__pygmalion-6b-vicuna-chatml
|
[
"region:us"
] |
2023-08-17T23:14:00+00:00
|
{"pretty_name": "Evaluation run of AlekseyKorshuk/pygmalion-6b-vicuna-chatml", "dataset_summary": "Dataset automatically created during the evaluation run of model [AlekseyKorshuk/pygmalion-6b-vicuna-chatml](https://huggingface.co/AlekseyKorshuk/pygmalion-6b-vicuna-chatml) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 5 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AlekseyKorshuk__pygmalion-6b-vicuna-chatml\",\n\t\"harness_gsm8k_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-02T15:52:11.848314](https://huggingface.co/datasets/open-llm-leaderboard/details_AlekseyKorshuk__pygmalion-6b-vicuna-chatml/blob/main/results_2023-12-02T15-52-11.848314.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.04397270659590599,\n \"acc_stderr\": 0.005647666449126458\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.04397270659590599,\n \"acc_stderr\": 0.005647666449126458\n }\n}\n```", "repo_url": "https://huggingface.co/AlekseyKorshuk/pygmalion-6b-vicuna-chatml", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_24T10_58_39.640665", "path": ["**/details_harness|arc:challenge|25_2023-07-24T10:58:39.640665.parquet"]}, {"split": "2023_08_01T14_16_25.052724", "path": ["**/details_harness|arc:challenge|25_2023-08-01T14:16:25.052724.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-01T14:16:25.052724.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_09T13_36_28.958118", "path": ["**/details_harness|drop|3_2023-09-09T13-36-28.958118.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-09T13-36-28.958118.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_09T13_36_28.958118", "path": ["**/details_harness|gsm8k|5_2023-09-09T13-36-28.958118.parquet"]}, {"split": "2023_12_02T15_52_11.848314", "path": ["**/details_harness|gsm8k|5_2023-12-02T15-52-11.848314.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-02T15-52-11.848314.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_24T10_58_39.640665", "path": ["**/details_harness|hellaswag|10_2023-07-24T10:58:39.640665.parquet"]}, {"split": "2023_08_01T14_16_25.052724", "path": ["**/details_harness|hellaswag|10_2023-08-01T14:16:25.052724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-01T14:16:25.052724.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_5", "data_files": [{"split": "2023_07_24T10_58_39.640665", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T10:58:39.640665.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T10:58:39.640665.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T10:58:39.640665.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T10:58:39.640665.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T10:58:39.640665.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T10:58:39.640665.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T10:58:39.640665.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T10:58:39.640665.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T10:58:39.640665.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T10:58:39.640665.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T10:58:39.640665.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T10:58:39.640665.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T10:58:39.640665.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-24T10:58:39.640665.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T10:58:39.640665.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T10:58:39.640665.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T10:58:39.640665.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T10:58:39.640665.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T10:58:39.640665.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T10:58:39.640665.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T10:58:39.640665.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T10:58:39.640665.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T10:58:39.640665.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T10:58:39.640665.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T10:58:39.640665.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T10:58:39.640665.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T10:58:39.640665.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T10:58:39.640665.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T10:58:39.640665.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T10:58:39.640665.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T10:58:39.640665.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T10:58:39.640665.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-24T10:58:39.640665.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T10:58:39.640665.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-24T10:58:39.640665.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T10:58:39.640665.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T10:58:39.640665.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T10:58:39.640665.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-24T10:58:39.640665.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T10:58:39.640665.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T10:58:39.640665.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T10:58:39.640665.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T10:58:39.640665.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T10:58:39.640665.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T10:58:39.640665.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T10:58:39.640665.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T10:58:39.640665.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T10:58:39.640665.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T10:58:39.640665.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T10:58:39.640665.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T10:58:39.640665.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-24T10:58:39.640665.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T10:58:39.640665.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T10:58:39.640665.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T10:58:39.640665.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T10:58:39.640665.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-24T10:58:39.640665.parquet"]}, {"split": "2023_08_01T14_16_25.052724", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-01T14:16:25.052724.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-01T14:16:25.052724.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-01T14:16:25.052724.parquet", 
"**/details_harness|hendrycksTest-management|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-01T14:16:25.052724.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-01T14:16:25.052724.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_24T10_58_39.640665", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T10:58:39.640665.parquet"]}, {"split": "2023_08_01T14_16_25.052724", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-01T14:16:25.052724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-01T14:16:25.052724.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_24T10_58_39.640665", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T10:58:39.640665.parquet"]}, {"split": "2023_08_01T14_16_25.052724", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-01T14:16:25.052724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-01T14:16:25.052724.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_24T10_58_39.640665", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T10:58:39.640665.parquet"]}, {"split": "2023_08_01T14_16_25.052724", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-01T14:16:25.052724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-01T14:16:25.052724.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_24T10_58_39.640665", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T10:58:39.640665.parquet"]}, {"split": "2023_08_01T14_16_25.052724", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-01T14:16:25.052724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-01T14:16:25.052724.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_24T10_58_39.640665", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T10:58:39.640665.parquet"]}, {"split": "2023_08_01T14_16_25.052724", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-01T14:16:25.052724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-01T14:16:25.052724.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_24T10_58_39.640665", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T10:58:39.640665.parquet"]}, {"split": "2023_08_01T14_16_25.052724", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-01T14:16:25.052724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-01T14:16:25.052724.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_24T10_58_39.640665", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T10:58:39.640665.parquet"]}, {"split": "2023_08_01T14_16_25.052724", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-01T14:16:25.052724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-01T14:16:25.052724.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_24T10_58_39.640665", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T10:58:39.640665.parquet"]}, {"split": "2023_08_01T14_16_25.052724", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-01T14:16:25.052724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-01T14:16:25.052724.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_24T10_58_39.640665", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T10:58:39.640665.parquet"]}, {"split": "2023_08_01T14_16_25.052724", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-01T14:16:25.052724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-01T14:16:25.052724.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_24T10_58_39.640665", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T10:58:39.640665.parquet"]}, {"split": "2023_08_01T14_16_25.052724", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-01T14:16:25.052724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-01T14:16:25.052724.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_24T10_58_39.640665", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T10:58:39.640665.parquet"]}, {"split": "2023_08_01T14_16_25.052724", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-01T14:16:25.052724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-01T14:16:25.052724.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_24T10_58_39.640665", "path": 
["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T10:58:39.640665.parquet"]}, {"split": "2023_08_01T14_16_25.052724", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-01T14:16:25.052724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-01T14:16:25.052724.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_24T10_58_39.640665", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T10:58:39.640665.parquet"]}, {"split": "2023_08_01T14_16_25.052724", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-01T14:16:25.052724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-01T14:16:25.052724.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_24T10_58_39.640665", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T10:58:39.640665.parquet"]}, {"split": "2023_08_01T14_16_25.052724", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-01T14:16:25.052724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-01T14:16:25.052724.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_24T10_58_39.640665", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T10:58:39.640665.parquet"]}, {"split": "2023_08_01T14_16_25.052724", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-01T14:16:25.052724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-01T14:16:25.052724.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_24T10_58_39.640665", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T10:58:39.640665.parquet"]}, {"split": "2023_08_01T14_16_25.052724", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-01T14:16:25.052724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-01T14:16:25.052724.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_24T10_58_39.640665", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T10:58:39.640665.parquet"]}, {"split": "2023_08_01T14_16_25.052724", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-01T14:16:25.052724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-01T14:16:25.052724.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_24T10_58_39.640665", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T10:58:39.640665.parquet"]}, {"split": "2023_08_01T14_16_25.052724", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-01T14:16:25.052724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-01T14:16:25.052724.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_24T10_58_39.640665", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T10:58:39.640665.parquet"]}, {"split": "2023_08_01T14_16_25.052724", "path": 
["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-01T14:16:25.052724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-01T14:16:25.052724.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_24T10_58_39.640665", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T10:58:39.640665.parquet"]}, {"split": "2023_08_01T14_16_25.052724", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-01T14:16:25.052724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-01T14:16:25.052724.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_24T10_58_39.640665", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T10:58:39.640665.parquet"]}, {"split": "2023_08_01T14_16_25.052724", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-01T14:16:25.052724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-01T14:16:25.052724.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_24T10_58_39.640665", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T10:58:39.640665.parquet"]}, {"split": "2023_08_01T14_16_25.052724", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-01T14:16:25.052724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-01T14:16:25.052724.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_24T10_58_39.640665", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T10:58:39.640665.parquet"]}, {"split": "2023_08_01T14_16_25.052724", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-01T14:16:25.052724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-01T14:16:25.052724.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_24T10_58_39.640665", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T10:58:39.640665.parquet"]}, {"split": "2023_08_01T14_16_25.052724", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-01T14:16:25.052724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-01T14:16:25.052724.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_24T10_58_39.640665", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T10:58:39.640665.parquet"]}, {"split": "2023_08_01T14_16_25.052724", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-01T14:16:25.052724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-01T14:16:25.052724.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_24T10_58_39.640665", "path": 
["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T10:58:39.640665.parquet"]}, {"split": "2023_08_01T14_16_25.052724", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-01T14:16:25.052724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-01T14:16:25.052724.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_24T10_58_39.640665", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T10:58:39.640665.parquet"]}, {"split": "2023_08_01T14_16_25.052724", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-01T14:16:25.052724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-01T14:16:25.052724.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_24T10_58_39.640665", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T10:58:39.640665.parquet"]}, {"split": "2023_08_01T14_16_25.052724", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-01T14:16:25.052724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-01T14:16:25.052724.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_24T10_58_39.640665", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T10:58:39.640665.parquet"]}, {"split": "2023_08_01T14_16_25.052724", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-01T14:16:25.052724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-01T14:16:25.052724.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_24T10_58_39.640665", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T10:58:39.640665.parquet"]}, {"split": "2023_08_01T14_16_25.052724", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-01T14:16:25.052724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-01T14:16:25.052724.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_24T10_58_39.640665", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T10:58:39.640665.parquet"]}, {"split": "2023_08_01T14_16_25.052724", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-01T14:16:25.052724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-01T14:16:25.052724.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_24T10_58_39.640665", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T10:58:39.640665.parquet"]}, {"split": "2023_08_01T14_16_25.052724", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-01T14:16:25.052724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-01T14:16:25.052724.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_24T10_58_39.640665", "path": 
["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T10:58:39.640665.parquet"]}, {"split": "2023_08_01T14_16_25.052724", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-01T14:16:25.052724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-01T14:16:25.052724.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_24T10_58_39.640665", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T10:58:39.640665.parquet"]}, {"split": "2023_08_01T14_16_25.052724", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-01T14:16:25.052724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-01T14:16:25.052724.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_24T10_58_39.640665", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T10:58:39.640665.parquet"]}, {"split": "2023_08_01T14_16_25.052724", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-01T14:16:25.052724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-01T14:16:25.052724.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_24T10_58_39.640665", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T10:58:39.640665.parquet"]}, {"split": "2023_08_01T14_16_25.052724", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-01T14:16:25.052724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-01T14:16:25.052724.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_24T10_58_39.640665", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T10:58:39.640665.parquet"]}, {"split": "2023_08_01T14_16_25.052724", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-01T14:16:25.052724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-01T14:16:25.052724.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_24T10_58_39.640665", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T10:58:39.640665.parquet"]}, {"split": "2023_08_01T14_16_25.052724", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-01T14:16:25.052724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-01T14:16:25.052724.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_24T10_58_39.640665", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T10:58:39.640665.parquet"]}, {"split": "2023_08_01T14_16_25.052724", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-01T14:16:25.052724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-01T14:16:25.052724.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_24T10_58_39.640665", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T10:58:39.640665.parquet"]}, {"split": "2023_08_01T14_16_25.052724", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-01T14:16:25.052724.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-marketing|5_2023-08-01T14:16:25.052724.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_24T10_58_39.640665", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T10:58:39.640665.parquet"]}, {"split": "2023_08_01T14_16_25.052724", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-01T14:16:25.052724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-01T14:16:25.052724.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_24T10_58_39.640665", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T10:58:39.640665.parquet"]}, {"split": "2023_08_01T14_16_25.052724", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-01T14:16:25.052724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-01T14:16:25.052724.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_24T10_58_39.640665", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T10:58:39.640665.parquet"]}, {"split": "2023_08_01T14_16_25.052724", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-01T14:16:25.052724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-01T14:16:25.052724.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_24T10_58_39.640665", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T10:58:39.640665.parquet"]}, {"split": "2023_08_01T14_16_25.052724", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-01T14:16:25.052724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-01T14:16:25.052724.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_24T10_58_39.640665", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T10:58:39.640665.parquet"]}, {"split": "2023_08_01T14_16_25.052724", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-01T14:16:25.052724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-01T14:16:25.052724.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_24T10_58_39.640665", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T10:58:39.640665.parquet"]}, {"split": "2023_08_01T14_16_25.052724", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-01T14:16:25.052724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-01T14:16:25.052724.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_24T10_58_39.640665", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T10:58:39.640665.parquet"]}, {"split": "2023_08_01T14_16_25.052724", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-01T14:16:25.052724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-01T14:16:25.052724.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_24T10_58_39.640665", "path": 
["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T10:58:39.640665.parquet"]}, {"split": "2023_08_01T14_16_25.052724", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-01T14:16:25.052724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-01T14:16:25.052724.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_24T10_58_39.640665", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T10:58:39.640665.parquet"]}, {"split": "2023_08_01T14_16_25.052724", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-01T14:16:25.052724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-01T14:16:25.052724.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_24T10_58_39.640665", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T10:58:39.640665.parquet"]}, {"split": "2023_08_01T14_16_25.052724", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-01T14:16:25.052724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-01T14:16:25.052724.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_24T10_58_39.640665", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T10:58:39.640665.parquet"]}, {"split": "2023_08_01T14_16_25.052724", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-01T14:16:25.052724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-01T14:16:25.052724.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_24T10_58_39.640665", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T10:58:39.640665.parquet"]}, {"split": "2023_08_01T14_16_25.052724", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-01T14:16:25.052724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-01T14:16:25.052724.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_24T10_58_39.640665", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T10:58:39.640665.parquet"]}, {"split": "2023_08_01T14_16_25.052724", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-01T14:16:25.052724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-01T14:16:25.052724.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_24T10_58_39.640665", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T10:58:39.640665.parquet"]}, {"split": "2023_08_01T14_16_25.052724", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-01T14:16:25.052724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-01T14:16:25.052724.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_24T10_58_39.640665", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T10:58:39.640665.parquet"]}, {"split": "2023_08_01T14_16_25.052724", "path": 
["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-01T14:16:25.052724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-01T14:16:25.052724.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_24T10_58_39.640665", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T10:58:39.640665.parquet"]}, {"split": "2023_08_01T14_16_25.052724", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-01T14:16:25.052724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-01T14:16:25.052724.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_24T10_58_39.640665", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T10:58:39.640665.parquet"]}, {"split": "2023_08_01T14_16_25.052724", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-01T14:16:25.052724.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-01T14:16:25.052724.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_24T10_58_39.640665", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T10:58:39.640665.parquet"]}, {"split": "2023_08_01T14_16_25.052724", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-01T14:16:25.052724.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-01T14:16:25.052724.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_09T13_36_28.958118", "path": ["**/details_harness|winogrande|5_2023-09-09T13-36-28.958118.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-09T13-36-28.958118.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_24T10_58_39.640665", "path": ["results_2023-07-24T10:58:39.640665.parquet"]}, {"split": "2023_08_01T14_16_25.052724", "path": ["results_2023-08-01T14:16:25.052724.parquet"]}, {"split": "2023_09_09T13_36_28.958118", "path": ["results_2023-09-09T13-36-28.958118.parquet"]}, {"split": "2023_12_02T15_52_04.252951", "path": ["results_2023-12-02T15-52-04.252951.parquet"]}, {"split": "2023_12_02T15_52_11.848314", "path": ["results_2023-12-02T15-52-11.848314.parquet"]}, {"split": "latest", "path": ["results_2023-12-02T15-52-11.848314.parquet"]}]}]}
|
2023-12-02T15:52:26+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of AlekseyKorshuk/pygmalion-6b-vicuna-chatml
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model AlekseyKorshuk/pygmalion-6b-vicuna-chatml on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 5 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-12-02T15:52:11.848314 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of AlekseyKorshuk/pygmalion-6b-vicuna-chatml",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model AlekseyKorshuk/pygmalion-6b-vicuna-chatml on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 5 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-02T15:52:11.848314(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of AlekseyKorshuk/pygmalion-6b-vicuna-chatml",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model AlekseyKorshuk/pygmalion-6b-vicuna-chatml on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 5 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-02T15:52:11.848314(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
27,
31,
176,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of AlekseyKorshuk/pygmalion-6b-vicuna-chatml## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model AlekseyKorshuk/pygmalion-6b-vicuna-chatml on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 5 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-02T15:52:11.848314(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
3b88970001bd64433434a346611de9a29449f972
|
# Dataset Card for Evaluation run of AlekseyKorshuk/chatml-pyg-v1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/AlekseyKorshuk/chatml-pyg-v1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [AlekseyKorshuk/chatml-pyg-v1](https://huggingface.co/AlekseyKorshuk/chatml-pyg-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_AlekseyKorshuk__chatml-pyg-v1",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-16T21:00:54.207494](https://huggingface.co/datasets/open-llm-leaderboard/details_AlekseyKorshuk__chatml-pyg-v1/blob/main/results_2023-09-16T21-00-54.207494.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.06354865771812081,
"em_stderr": 0.0024982474364717406,
"f1": 0.11724203020134202,
"f1_stderr": 0.0027033976138729605,
"acc": 0.3383264329904803,
"acc_stderr": 0.009848216239525413
},
"harness|drop|3": {
"em": 0.06354865771812081,
"em_stderr": 0.0024982474364717406,
"f1": 0.11724203020134202,
"f1_stderr": 0.0027033976138729605
},
"harness|gsm8k|5": {
"acc": 0.05155420773313116,
"acc_stderr": 0.006090887955262826
},
"harness|winogrande|5": {
"acc": 0.6250986582478295,
"acc_stderr": 0.013605544523788001
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_AlekseyKorshuk__chatml-pyg-v1
|
[
"region:us"
] |
2023-08-17T23:14:11+00:00
|
{"pretty_name": "Evaluation run of AlekseyKorshuk/chatml-pyg-v1", "dataset_summary": "Dataset automatically created during the evaluation run of model [AlekseyKorshuk/chatml-pyg-v1](https://huggingface.co/AlekseyKorshuk/chatml-pyg-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AlekseyKorshuk__chatml-pyg-v1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-16T21:00:54.207494](https://huggingface.co/datasets/open-llm-leaderboard/details_AlekseyKorshuk__chatml-pyg-v1/blob/main/results_2023-09-16T21-00-54.207494.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.06354865771812081,\n \"em_stderr\": 0.0024982474364717406,\n \"f1\": 0.11724203020134202,\n \"f1_stderr\": 0.0027033976138729605,\n \"acc\": 0.3383264329904803,\n \"acc_stderr\": 0.009848216239525413\n },\n \"harness|drop|3\": {\n \"em\": 0.06354865771812081,\n \"em_stderr\": 0.0024982474364717406,\n \"f1\": 0.11724203020134202,\n \"f1_stderr\": 0.0027033976138729605\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.05155420773313116,\n \"acc_stderr\": 0.006090887955262826\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6250986582478295,\n \"acc_stderr\": 0.013605544523788001\n }\n}\n```", "repo_url": "https://huggingface.co/AlekseyKorshuk/chatml-pyg-v1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_18T19_38_34.758007", "path": ["**/details_harness|arc:challenge|25_2023-07-18T19:38:34.758007.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-18T19:38:34.758007.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_16T21_00_54.207494", "path": ["**/details_harness|drop|3_2023-09-16T21-00-54.207494.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-16T21-00-54.207494.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_16T21_00_54.207494", "path": ["**/details_harness|gsm8k|5_2023-09-16T21-00-54.207494.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-16T21-00-54.207494.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_18T19_38_34.758007", "path": ["**/details_harness|hellaswag|10_2023-07-18T19:38:34.758007.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-18T19:38:34.758007.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", 
"data_files": [{"split": "2023_07_18T19_38_34.758007", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T19:38:34.758007.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-18T19:38:34.758007.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T19:38:34.758007.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-18T19:38:34.758007.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-18T19:38:34.758007.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_18T19_38_34.758007", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T19:38:34.758007.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T19:38:34.758007.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_18T19_38_34.758007", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-18T19:38:34.758007.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-18T19:38:34.758007.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_18T19_38_34.758007", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-18T19:38:34.758007.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-18T19:38:34.758007.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_18T19_38_34.758007", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T19:38:34.758007.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T19:38:34.758007.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_18T19_38_34.758007", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T19:38:34.758007.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T19:38:34.758007.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_18T19_38_34.758007", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-18T19:38:34.758007.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-18T19:38:34.758007.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_18T19_38_34.758007", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T19:38:34.758007.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T19:38:34.758007.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_18T19_38_34.758007", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T19:38:34.758007.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T19:38:34.758007.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_18T19_38_34.758007", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T19:38:34.758007.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T19:38:34.758007.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_18T19_38_34.758007", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T19:38:34.758007.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T19:38:34.758007.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_18T19_38_34.758007", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-18T19:38:34.758007.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-18T19:38:34.758007.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_18T19_38_34.758007", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-18T19:38:34.758007.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-18T19:38:34.758007.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_18T19_38_34.758007", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T19:38:34.758007.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T19:38:34.758007.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_18T19_38_34.758007", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-18T19:38:34.758007.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-18T19:38:34.758007.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_18T19_38_34.758007", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T19:38:34.758007.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T19:38:34.758007.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_18T19_38_34.758007", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T19:38:34.758007.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T19:38:34.758007.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_18T19_38_34.758007", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T19:38:34.758007.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T19:38:34.758007.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_18T19_38_34.758007", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-18T19:38:34.758007.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-18T19:38:34.758007.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_18T19_38_34.758007", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T19:38:34.758007.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T19:38:34.758007.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_18T19_38_34.758007", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T19:38:34.758007.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T19:38:34.758007.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_18T19_38_34.758007", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T19:38:34.758007.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T19:38:34.758007.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_18T19_38_34.758007", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T19:38:34.758007.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T19:38:34.758007.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_18T19_38_34.758007", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T19:38:34.758007.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T19:38:34.758007.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_18T19_38_34.758007", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T19:38:34.758007.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T19:38:34.758007.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_18T19_38_34.758007", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T19:38:34.758007.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T19:38:34.758007.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_18T19_38_34.758007", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T19:38:34.758007.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T19:38:34.758007.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_18T19_38_34.758007", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T19:38:34.758007.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T19:38:34.758007.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_18T19_38_34.758007", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T19:38:34.758007.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T19:38:34.758007.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_18T19_38_34.758007", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T19:38:34.758007.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T19:38:34.758007.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_18T19_38_34.758007", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T19:38:34.758007.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T19:38:34.758007.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_18T19_38_34.758007", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T19:38:34.758007.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T19:38:34.758007.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_18T19_38_34.758007", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T19:38:34.758007.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T19:38:34.758007.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_18T19_38_34.758007", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-18T19:38:34.758007.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-18T19:38:34.758007.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_18T19_38_34.758007", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T19:38:34.758007.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T19:38:34.758007.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_18T19_38_34.758007", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-18T19:38:34.758007.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-18T19:38:34.758007.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_18T19_38_34.758007", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T19:38:34.758007.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T19:38:34.758007.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_18T19_38_34.758007", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T19:38:34.758007.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T19:38:34.758007.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_18T19_38_34.758007", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T19:38:34.758007.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T19:38:34.758007.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_18T19_38_34.758007", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-18T19:38:34.758007.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-18T19:38:34.758007.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_18T19_38_34.758007", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-18T19:38:34.758007.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-18T19:38:34.758007.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_18T19_38_34.758007", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T19:38:34.758007.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T19:38:34.758007.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_18T19_38_34.758007", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T19:38:34.758007.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T19:38:34.758007.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_18T19_38_34.758007", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T19:38:34.758007.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T19:38:34.758007.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_18T19_38_34.758007", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T19:38:34.758007.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T19:38:34.758007.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_18T19_38_34.758007", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-18T19:38:34.758007.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-18T19:38:34.758007.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_18T19_38_34.758007", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-18T19:38:34.758007.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-18T19:38:34.758007.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_18T19_38_34.758007", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-18T19:38:34.758007.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-18T19:38:34.758007.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_18T19_38_34.758007", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T19:38:34.758007.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T19:38:34.758007.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_18T19_38_34.758007", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-18T19:38:34.758007.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-18T19:38:34.758007.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_18T19_38_34.758007", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T19:38:34.758007.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T19:38:34.758007.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_18T19_38_34.758007", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T19:38:34.758007.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T19:38:34.758007.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_18T19_38_34.758007", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-18T19:38:34.758007.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-18T19:38:34.758007.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_18T19_38_34.758007", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-18T19:38:34.758007.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-18T19:38:34.758007.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_18T19_38_34.758007", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-18T19:38:34.758007.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-18T19:38:34.758007.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_18T19_38_34.758007", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T19:38:34.758007.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T19:38:34.758007.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_18T19_38_34.758007", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-18T19:38:34.758007.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-18T19:38:34.758007.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_18T19_38_34.758007", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-18T19:38:34.758007.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-18T19:38:34.758007.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_18T19_38_34.758007", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-18T19:38:34.758007.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-18T19:38:34.758007.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_16T21_00_54.207494", "path": ["**/details_harness|winogrande|5_2023-09-16T21-00-54.207494.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-16T21-00-54.207494.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_18T19_38_34.758007", "path": ["results_2023-07-18T19:38:34.758007.parquet"]}, {"split": "2023_09_16T21_00_54.207494", "path": ["results_2023-09-16T21-00-54.207494.parquet"]}, {"split": "latest", "path": ["results_2023-09-16T21-00-54.207494.parquet"]}]}]}
|
2023-09-16T20:01:06+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of AlekseyKorshuk/chatml-pyg-v1
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model AlekseyKorshuk/chatml-pyg-v1 on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
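A minimal sketch, using the repository id and a config name that appear in this card's metadata:
```python
from datasets import load_dataset

# Repository id and config name taken from this card's metadata.
data = load_dataset(
    "open-llm-leaderboard/details_AlekseyKorshuk__chatml-pyg-v1",
    "harness_winogrande_5",
    split="train",
)
```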
## Latest results
These are the latest results from run 2023-09-16T21:00:54.207494 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of AlekseyKorshuk/chatml-pyg-v1",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model AlekseyKorshuk/chatml-pyg-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-16T21:00:54.207494(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of AlekseyKorshuk/chatml-pyg-v1",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model AlekseyKorshuk/chatml-pyg-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-16T21:00:54.207494(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
23,
31,
171,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of AlekseyKorshuk/chatml-pyg-v1## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model AlekseyKorshuk/chatml-pyg-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-16T21:00:54.207494(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
61d636a33c30f9888326d3a4da92037ddb469551
|
# Dataset Card for Evaluation run of bigscience/bloom-1b1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/bigscience/bloom-1b1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [bigscience/bloom-1b1](https://huggingface.co/bigscience/bloom-1b1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 10 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_bigscience__bloom-1b1",
"harness_gsm8k_5",
split="train")
```
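To inspect individual predictions rather than the aggregate scores below, the loaded split can be converted to a pandas DataFrame (a usage sketch; the exact column names depend on the eval harness version, so treat them as an assumption):
```python
# Continuing from the snippet above: turn the per-sample details into a DataFrame.
df = data.to_pandas()
print(df.shape)
print(df.columns.tolist())  # exact columns depend on the harness version
```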
## Latest results
These are the [latest results from run 2023-12-04T13:05:11.599988](https://huggingface.co/datasets/open-llm-leaderboard/details_bigscience__bloom-1b1/blob/main/results_2023-12-04T13-05-11.599988.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.002274450341167551,
"acc_stderr": 0.001312157814867416
},
"harness|gsm8k|5": {
"acc": 0.002274450341167551,
"acc_stderr": 0.001312157814867416
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_bigscience__bloom-1b1
|
[
"region:us"
] |
2023-08-17T23:14:20+00:00
|
{"pretty_name": "Evaluation run of bigscience/bloom-1b1", "dataset_summary": "Dataset automatically created during the evaluation run of model [bigscience/bloom-1b1](https://huggingface.co/bigscience/bloom-1b1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 10 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_bigscience__bloom-1b1\",\n\t\"harness_gsm8k_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-04T13:05:11.599988](https://huggingface.co/datasets/open-llm-leaderboard/details_bigscience__bloom-1b1/blob/main/results_2023-12-04T13-05-11.599988.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.002274450341167551,\n \"acc_stderr\": 0.001312157814867416\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.002274450341167551,\n \"acc_stderr\": 0.001312157814867416\n }\n}\n```", "repo_url": "https://huggingface.co/bigscience/bloom-1b1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_09T09_50_28.133292", "path": ["**/details_harness|arc:challenge|25_2023-08-09T09:50:28.133292.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-09T09:50:28.133292.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_16T15_33_49.379039", "path": ["**/details_harness|drop|3_2023-10-16T15-33-49.379039.parquet"]}, {"split": "2023_10_18T03_48_56.783565", "path": ["**/details_harness|drop|3_2023-10-18T03-48-56.783565.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-18T03-48-56.783565.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_16T15_33_49.379039", "path": ["**/details_harness|gsm8k|5_2023-10-16T15-33-49.379039.parquet"]}, {"split": "2023_10_18T03_48_56.783565", "path": ["**/details_harness|gsm8k|5_2023-10-18T03-48-56.783565.parquet"]}, {"split": "2023_12_03T15_03_23.637549", "path": ["**/details_harness|gsm8k|5_2023-12-03T15-03-23.637549.parquet"]}, {"split": "2023_12_03T16_05_47.069863", "path": ["**/details_harness|gsm8k|5_2023-12-03T16-05-47.069863.parquet"]}, {"split": "2023_12_03T16_07_04.058060", "path": ["**/details_harness|gsm8k|5_2023-12-03T16-07-04.058060.parquet"]}, {"split": "2023_12_04T09_54_40.951209", "path": ["**/details_harness|gsm8k|5_2023-12-04T09-54-40.951209.parquet"]}, {"split": "2023_12_04T09_55_47.157796", "path": ["**/details_harness|gsm8k|5_2023-12-04T09-55-47.157796.parquet"]}, {"split": "2023_12_04T13_04_41.368838", "path": 
["**/details_harness|gsm8k|5_2023-12-04T13-04-41.368838.parquet"]}, {"split": "2023_12_04T13_05_11.599988", "path": ["**/details_harness|gsm8k|5_2023-12-04T13-05-11.599988.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-04T13-05-11.599988.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_09T09_50_28.133292", "path": ["**/details_harness|hellaswag|10_2023-08-09T09:50:28.133292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-09T09:50:28.133292.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_09T09_50_28.133292", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T09:50:28.133292.parquet", 
"**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T09:50:28.133292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T09:50:28.133292.parquet", 
"**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T09:50:28.133292.parquet", 
"**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T09:50:28.133292.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T09:50:28.133292.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_09T09_50_28.133292", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T09:50:28.133292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T09:50:28.133292.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_09T09_50_28.133292", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T09:50:28.133292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T09:50:28.133292.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_09T09_50_28.133292", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T09:50:28.133292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T09:50:28.133292.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_09T09_50_28.133292", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T09:50:28.133292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T09:50:28.133292.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_09T09_50_28.133292", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T09:50:28.133292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T09:50:28.133292.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_09T09_50_28.133292", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T09:50:28.133292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T09:50:28.133292.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_09T09_50_28.133292", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T09:50:28.133292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T09:50:28.133292.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_09T09_50_28.133292", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T09:50:28.133292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T09:50:28.133292.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_09T09_50_28.133292", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T09:50:28.133292.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T09:50:28.133292.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_09T09_50_28.133292", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T09:50:28.133292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T09:50:28.133292.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_09T09_50_28.133292", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T09:50:28.133292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T09:50:28.133292.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_09T09_50_28.133292", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T09:50:28.133292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T09:50:28.133292.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_09T09_50_28.133292", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T09:50:28.133292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T09:50:28.133292.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_09T09_50_28.133292", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T09:50:28.133292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T09:50:28.133292.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_09T09_50_28.133292", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T09:50:28.133292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T09:50:28.133292.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_09T09_50_28.133292", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T09:50:28.133292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T09:50:28.133292.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_09T09_50_28.133292", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T09:50:28.133292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T09:50:28.133292.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_09T09_50_28.133292", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T09:50:28.133292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T09:50:28.133292.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_09T09_50_28.133292", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T09:50:28.133292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T09:50:28.133292.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_09T09_50_28.133292", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T09:50:28.133292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T09:50:28.133292.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_09T09_50_28.133292", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T09:50:28.133292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T09:50:28.133292.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_09T09_50_28.133292", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T09:50:28.133292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T09:50:28.133292.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_09T09_50_28.133292", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T09:50:28.133292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T09:50:28.133292.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_09T09_50_28.133292", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T09:50:28.133292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T09:50:28.133292.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_09T09_50_28.133292", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T09:50:28.133292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T09:50:28.133292.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_09T09_50_28.133292", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T09:50:28.133292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T09:50:28.133292.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_09T09_50_28.133292", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T09:50:28.133292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T09:50:28.133292.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_09T09_50_28.133292", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T09:50:28.133292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T09:50:28.133292.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_09T09_50_28.133292", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T09:50:28.133292.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T09:50:28.133292.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_09T09_50_28.133292", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T09:50:28.133292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T09:50:28.133292.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_09T09_50_28.133292", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T09:50:28.133292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T09:50:28.133292.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_09T09_50_28.133292", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T09:50:28.133292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T09:50:28.133292.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_09T09_50_28.133292", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T09:50:28.133292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T09:50:28.133292.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_09T09_50_28.133292", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T09:50:28.133292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T09:50:28.133292.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_09T09_50_28.133292", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T09:50:28.133292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T09:50:28.133292.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_09T09_50_28.133292", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T09:50:28.133292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T09:50:28.133292.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_09T09_50_28.133292", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T09:50:28.133292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T09:50:28.133292.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_09T09_50_28.133292", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T09:50:28.133292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T09:50:28.133292.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_09T09_50_28.133292", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T09:50:28.133292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T09:50:28.133292.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_09T09_50_28.133292", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T09:50:28.133292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T09:50:28.133292.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_09T09_50_28.133292", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T09:50:28.133292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T09:50:28.133292.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_09T09_50_28.133292", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T09:50:28.133292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T09:50:28.133292.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_09T09_50_28.133292", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T09:50:28.133292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T09:50:28.133292.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_09T09_50_28.133292", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T09:50:28.133292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T09:50:28.133292.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_09T09_50_28.133292", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T09:50:28.133292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T09:50:28.133292.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_09T09_50_28.133292", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T09:50:28.133292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T09:50:28.133292.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_09T09_50_28.133292", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T09:50:28.133292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T09:50:28.133292.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_09T09_50_28.133292", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T09:50:28.133292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T09:50:28.133292.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_09T09_50_28.133292", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T09:50:28.133292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T09:50:28.133292.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_09T09_50_28.133292", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T09:50:28.133292.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T09:50:28.133292.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_09T09_50_28.133292", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T09:50:28.133292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T09:50:28.133292.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_09T09_50_28.133292", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T09:50:28.133292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T09:50:28.133292.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_09T09_50_28.133292", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T09:50:28.133292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T09:50:28.133292.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_09T09_50_28.133292", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T09:50:28.133292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T09:50:28.133292.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_09T09_50_28.133292", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T09:50:28.133292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T09:50:28.133292.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_09T09_50_28.133292", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T09:50:28.133292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T09:50:28.133292.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_09T09_50_28.133292", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T09:50:28.133292.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T09:50:28.133292.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_09T09_50_28.133292", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T09:50:28.133292.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T09:50:28.133292.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_16T15_33_49.379039", "path": ["**/details_harness|winogrande|5_2023-10-16T15-33-49.379039.parquet"]}, {"split": "2023_10_18T03_48_56.783565", "path": ["**/details_harness|winogrande|5_2023-10-18T03-48-56.783565.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-18T03-48-56.783565.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_09T09_50_28.133292", "path": ["results_2023-08-09T09:50:28.133292.parquet"]}, {"split": "2023_10_16T15_33_49.379039", "path": ["results_2023-10-16T15-33-49.379039.parquet"]}, {"split": "2023_10_18T03_48_56.783565", "path": ["results_2023-10-18T03-48-56.783565.parquet"]}, {"split": "2023_12_03T15_03_23.637549", "path": 
["results_2023-12-03T15-03-23.637549.parquet"]}, {"split": "2023_12_03T16_05_47.069863", "path": ["results_2023-12-03T16-05-47.069863.parquet"]}, {"split": "2023_12_03T16_07_04.058060", "path": ["results_2023-12-03T16-07-04.058060.parquet"]}, {"split": "2023_12_04T09_54_40.951209", "path": ["results_2023-12-04T09-54-40.951209.parquet"]}, {"split": "2023_12_04T09_55_47.157796", "path": ["results_2023-12-04T09-55-47.157796.parquet"]}, {"split": "2023_12_04T13_04_41.368838", "path": ["results_2023-12-04T13-04-41.368838.parquet"]}, {"split": "2023_12_04T13_05_11.599988", "path": ["results_2023-12-04T13-05-11.599988.parquet"]}, {"split": "latest", "path": ["results_2023-12-04T13-05-11.599988.parquet"]}]}]}
|
2023-12-04T13:05:17+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of bigscience/bloom-1b1
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model bigscience/bloom-1b1 on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 10 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
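```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_bigscience__bloom-1b1",
	"harness_gsm8k_5",
	split="train")
```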
## Latest results
These are the latest results from run 2023-12-04T13:05:11.599988 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
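```python
{
    "all": {
        "acc": 0.002274450341167551,
        "acc_stderr": 0.001312157814867416
    },
    "harness|gsm8k|5": {
        "acc": 0.002274450341167551,
        "acc_stderr": 0.001312157814867416
    }
}
```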
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of bigscience/bloom-1b1",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model bigscience/bloom-1b1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 10 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-04T13:05:11.599988(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of bigscience/bloom-1b1",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model bigscience/bloom-1b1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 10 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-04T13:05:11.599988(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
17,
31,
166,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of bigscience/bloom-1b1## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model bigscience/bloom-1b1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 10 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-04T13:05:11.599988(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
ba2d18818b9fc9d9ffcb09c2be3c9fe98185c1a0
|
# Dataset Card for Evaluation run of bigscience/bloom-560m
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/bigscience/bloom-560m
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [bigscience/bloom-560m](https://huggingface.co/bigscience/bloom-560m) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 13 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_bigscience__bloom-560m",
"harness_gsm8k_5",
split="train")
```
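Each timestamped run is also exposed as its own split. As a minimal sketch (using a split name taken from this card's own config listing), you could load the GSM8K details from one specific run instead of the latest:
```python
from datasets import load_dataset

# Split names replace ":" and "-" with "_", so the run
# 2023-12-04T13:05:03.033636 becomes the split "2023_12_04T13_05_03.033636".
run = load_dataset("open-llm-leaderboard/details_bigscience__bloom-560m",
                   "harness_gsm8k_5",
                   split="2023_12_04T13_05_03.033636")
```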
## Latest results
These are the [latest results from run 2023-12-04T13:05:03.033636](https://huggingface.co/datasets/open-llm-leaderboard/details_bigscience__bloom-560m/blob/main/results_2023-12-04T13-05-03.033636.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.003032600454890068,
"acc_stderr": 0.0015145735612245468
},
"harness|gsm8k|5": {
"acc": 0.003032600454890068,
"acc_stderr": 0.0015145735612245468
}
}
```
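The aggregated "results" configuration described above can be loaded the same way; a minimal sketch, assuming you want the most recent aggregation:
```python
from datasets import load_dataset

# "results" aggregates all runs; the "latest" split points to the newest one.
results = load_dataset("open-llm-leaderboard/details_bigscience__bloom-560m",
                       "results",
                       split="latest")
```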
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_bigscience__bloom-560m
|
[
"region:us"
] |
2023-08-17T23:14:29+00:00
|
{"pretty_name": "Evaluation run of bigscience/bloom-560m", "dataset_summary": "Dataset automatically created during the evaluation run of model [bigscience/bloom-560m](https://huggingface.co/bigscience/bloom-560m) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 13 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_bigscience__bloom-560m\",\n\t\"harness_gsm8k_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-04T13:05:03.033636](https://huggingface.co/datasets/open-llm-leaderboard/details_bigscience__bloom-560m/blob/main/results_2023-12-04T13-05-03.033636.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.003032600454890068,\n \"acc_stderr\": 0.0015145735612245468\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.003032600454890068,\n \"acc_stderr\": 0.0015145735612245468\n }\n}\n```", "repo_url": "https://huggingface.co/bigscience/bloom-560m", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_09T09_50_46.994927", "path": ["**/details_harness|arc:challenge|25_2023-08-09T09:50:46.994927.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-09T09:50:46.994927.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_17T01_44_51.787860", "path": ["**/details_harness|drop|3_2023-10-17T01-44-51.787860.parquet"]}, {"split": "2023_10_19T07_58_25.532907", "path": ["**/details_harness|drop|3_2023-10-19T07-58-25.532907.parquet"]}, {"split": "2023_10_19T11_57_26.532188", "path": ["**/details_harness|drop|3_2023-10-19T11-57-26.532188.parquet"]}, {"split": "2023_10_19T13_58_30.472160", "path": ["**/details_harness|drop|3_2023-10-19T13-58-30.472160.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-19T13-58-30.472160.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_17T01_44_51.787860", "path": ["**/details_harness|gsm8k|5_2023-10-17T01-44-51.787860.parquet"]}, {"split": "2023_10_19T07_58_25.532907", "path": ["**/details_harness|gsm8k|5_2023-10-19T07-58-25.532907.parquet"]}, {"split": "2023_10_19T11_57_26.532188", "path": ["**/details_harness|gsm8k|5_2023-10-19T11-57-26.532188.parquet"]}, {"split": "2023_10_19T13_58_30.472160", "path": ["**/details_harness|gsm8k|5_2023-10-19T13-58-30.472160.parquet"]}, {"split": "2023_12_03T15_01_55.935382", "path": ["**/details_harness|gsm8k|5_2023-12-03T15-01-55.935382.parquet"]}, {"split": "2023_12_03T15_02_09.067243", "path": 
["**/details_harness|gsm8k|5_2023-12-03T15-02-09.067243.parquet"]}, {"split": "2023_12_03T16_04_42.088670", "path": ["**/details_harness|gsm8k|5_2023-12-03T16-04-42.088670.parquet"]}, {"split": "2023_12_03T16_05_29.861058", "path": ["**/details_harness|gsm8k|5_2023-12-03T16-05-29.861058.parquet"]}, {"split": "2023_12_04T09_54_26.106896", "path": ["**/details_harness|gsm8k|5_2023-12-04T09-54-26.106896.parquet"]}, {"split": "2023_12_04T09_54_41.464190", "path": ["**/details_harness|gsm8k|5_2023-12-04T09-54-41.464190.parquet"]}, {"split": "2023_12_04T13_04_03.136528", "path": ["**/details_harness|gsm8k|5_2023-12-04T13-04-03.136528.parquet"]}, {"split": "2023_12_04T13_05_03.033636", "path": ["**/details_harness|gsm8k|5_2023-12-04T13-05-03.033636.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-04T13-05-03.033636.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_09T09_50_46.994927", "path": ["**/details_harness|hellaswag|10_2023-08-09T09:50:46.994927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-09T09:50:46.994927.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_09T09_50_46.994927", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T09:50:46.994927.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T09:50:46.994927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T09:50:46.994927.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T09:50:46.994927.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T09:50:46.994927.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T09:50:46.994927.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_09T09_50_46.994927", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T09:50:46.994927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T09:50:46.994927.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_09T09_50_46.994927", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T09:50:46.994927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T09:50:46.994927.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_09T09_50_46.994927", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T09:50:46.994927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T09:50:46.994927.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_09T09_50_46.994927", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T09:50:46.994927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T09:50:46.994927.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_09T09_50_46.994927", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T09:50:46.994927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T09:50:46.994927.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_09T09_50_46.994927", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T09:50:46.994927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T09:50:46.994927.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_09T09_50_46.994927", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T09:50:46.994927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T09:50:46.994927.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2023_08_09T09_50_46.994927", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T09:50:46.994927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T09:50:46.994927.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_09T09_50_46.994927", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T09:50:46.994927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T09:50:46.994927.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_09T09_50_46.994927", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T09:50:46.994927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T09:50:46.994927.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_09T09_50_46.994927", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T09:50:46.994927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T09:50:46.994927.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_09T09_50_46.994927", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T09:50:46.994927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T09:50:46.994927.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_09T09_50_46.994927", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T09:50:46.994927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T09:50:46.994927.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_09T09_50_46.994927", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T09:50:46.994927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T09:50:46.994927.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_09T09_50_46.994927", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T09:50:46.994927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T09:50:46.994927.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_09T09_50_46.994927", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T09:50:46.994927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T09:50:46.994927.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_09T09_50_46.994927", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T09:50:46.994927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T09:50:46.994927.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_09T09_50_46.994927", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T09:50:46.994927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T09:50:46.994927.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_09T09_50_46.994927", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T09:50:46.994927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T09:50:46.994927.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_09T09_50_46.994927", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T09:50:46.994927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T09:50:46.994927.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_09T09_50_46.994927", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T09:50:46.994927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T09:50:46.994927.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_09T09_50_46.994927", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T09:50:46.994927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T09:50:46.994927.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_09T09_50_46.994927", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T09:50:46.994927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T09:50:46.994927.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_09T09_50_46.994927", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T09:50:46.994927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T09:50:46.994927.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_09T09_50_46.994927", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T09:50:46.994927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T09:50:46.994927.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_09T09_50_46.994927", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T09:50:46.994927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T09:50:46.994927.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_09T09_50_46.994927", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T09:50:46.994927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T09:50:46.994927.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_09T09_50_46.994927", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T09:50:46.994927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T09:50:46.994927.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_09T09_50_46.994927", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T09:50:46.994927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T09:50:46.994927.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_09T09_50_46.994927", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T09:50:46.994927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T09:50:46.994927.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_09T09_50_46.994927", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T09:50:46.994927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T09:50:46.994927.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_09T09_50_46.994927", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T09:50:46.994927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T09:50:46.994927.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_09T09_50_46.994927", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T09:50:46.994927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T09:50:46.994927.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_09T09_50_46.994927", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T09:50:46.994927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T09:50:46.994927.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_09T09_50_46.994927", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T09:50:46.994927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T09:50:46.994927.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_09T09_50_46.994927", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T09:50:46.994927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T09:50:46.994927.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_09T09_50_46.994927", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T09:50:46.994927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T09:50:46.994927.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2023_08_09T09_50_46.994927", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T09:50:46.994927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T09:50:46.994927.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_09T09_50_46.994927", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T09:50:46.994927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T09:50:46.994927.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_09T09_50_46.994927", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T09:50:46.994927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T09:50:46.994927.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_09T09_50_46.994927", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T09:50:46.994927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T09:50:46.994927.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_09T09_50_46.994927", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T09:50:46.994927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T09:50:46.994927.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_09T09_50_46.994927", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T09:50:46.994927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T09:50:46.994927.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_09T09_50_46.994927", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T09:50:46.994927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T09:50:46.994927.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_09T09_50_46.994927", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T09:50:46.994927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T09:50:46.994927.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_09T09_50_46.994927", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T09:50:46.994927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T09:50:46.994927.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_09T09_50_46.994927", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T09:50:46.994927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T09:50:46.994927.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_09T09_50_46.994927", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T09:50:46.994927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T09:50:46.994927.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_09T09_50_46.994927", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T09:50:46.994927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T09:50:46.994927.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_09T09_50_46.994927", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T09:50:46.994927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T09:50:46.994927.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_09T09_50_46.994927", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T09:50:46.994927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T09:50:46.994927.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_09T09_50_46.994927", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T09:50:46.994927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T09:50:46.994927.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_09T09_50_46.994927", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T09:50:46.994927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T09:50:46.994927.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_09T09_50_46.994927", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T09:50:46.994927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T09:50:46.994927.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_09T09_50_46.994927", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T09:50:46.994927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T09:50:46.994927.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_09T09_50_46.994927", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T09:50:46.994927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T09:50:46.994927.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_09T09_50_46.994927", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T09:50:46.994927.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T09:50:46.994927.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_09T09_50_46.994927", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T09:50:46.994927.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T09:50:46.994927.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_17T01_44_51.787860", "path": ["**/details_harness|winogrande|5_2023-10-17T01-44-51.787860.parquet"]}, {"split": "2023_10_19T07_58_25.532907", "path": 
["**/details_harness|winogrande|5_2023-10-19T07-58-25.532907.parquet"]}, {"split": "2023_10_19T11_57_26.532188", "path": ["**/details_harness|winogrande|5_2023-10-19T11-57-26.532188.parquet"]}, {"split": "2023_10_19T13_58_30.472160", "path": ["**/details_harness|winogrande|5_2023-10-19T13-58-30.472160.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-19T13-58-30.472160.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_09T09_50_46.994927", "path": ["results_2023-08-09T09:50:46.994927.parquet"]}, {"split": "2023_10_17T01_44_51.787860", "path": ["results_2023-10-17T01-44-51.787860.parquet"]}, {"split": "2023_10_19T07_58_25.532907", "path": ["results_2023-10-19T07-58-25.532907.parquet"]}, {"split": "2023_10_19T11_57_26.532188", "path": ["results_2023-10-19T11-57-26.532188.parquet"]}, {"split": "2023_10_19T13_58_30.472160", "path": ["results_2023-10-19T13-58-30.472160.parquet"]}, {"split": "2023_12_03T15_01_55.935382", "path": ["results_2023-12-03T15-01-55.935382.parquet"]}, {"split": "2023_12_03T15_02_09.067243", "path": ["results_2023-12-03T15-02-09.067243.parquet"]}, {"split": "2023_12_03T16_04_42.088670", "path": ["results_2023-12-03T16-04-42.088670.parquet"]}, {"split": "2023_12_03T16_05_29.861058", "path": ["results_2023-12-03T16-05-29.861058.parquet"]}, {"split": "2023_12_04T09_54_26.106896", "path": ["results_2023-12-04T09-54-26.106896.parquet"]}, {"split": "2023_12_04T09_54_41.464190", "path": ["results_2023-12-04T09-54-41.464190.parquet"]}, {"split": "2023_12_04T13_04_03.136528", "path": ["results_2023-12-04T13-04-03.136528.parquet"]}, {"split": "2023_12_04T13_05_03.033636", "path": ["results_2023-12-04T13-05-03.033636.parquet"]}, {"split": "latest", "path": ["results_2023-12-04T13-05-03.033636.parquet"]}]}]}
|
2023-12-04T13:05:09+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of bigscience/bloom-560m
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model bigscience/bloom-560m on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 13 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
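A minimal sketch, following the loading pattern the other cards in this collection use (the repository id and config name below are inferred from the configs listed above for this dataset, not quoted from this card):

```python
from datasets import load_dataset

# "harness_winogrande_5" is one of the configs declared for this dataset;
# the "train" split always points to the latest run.
data = load_dataset("open-llm-leaderboard/details_bigscience__bloom-560m",
	"harness_winogrande_5",
	split="train")
```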
## Latest results
These are the latest results from run 2023-12-04T13:05:03.033636 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
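The raw results JSON is not reproduced in this card, but the aggregated numbers can be retrieved from the "results" configuration (a sketch, assuming the same repository id as above; "latest" is a split declared by that config):

```python
from datasets import load_dataset

# "latest" resolves to the most recent results file for this dataset
# (results_2023-12-04T13-05-03.033636.parquet at the time of writing).
results = load_dataset("open-llm-leaderboard/details_bigscience__bloom-560m",
	"results",
	split="latest")
```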
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of bigscience/bloom-560m",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model bigscience/bloom-560m on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 13 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-04T13:05:03.033636(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of bigscience/bloom-560m",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model bigscience/bloom-560m on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 13 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-04T13:05:03.033636(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
17,
31,
166,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of bigscience/bloom-560m## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model bigscience/bloom-560m on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 13 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-04T13:05:03.033636(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
b942710bb6cfb060186be06495b4fd82e86eb2d1
|
# Dataset Card for Evaluation run of bigscience/bloom-7b1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/bigscience/bloom-7b1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [bigscience/bloom-7b1](https://huggingface.co/bigscience/bloom-7b1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 10 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_bigscience__bloom-7b1",
"harness_gsm8k_5",
split="train")
```
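Each earlier run also remains available as its own timestamped split; for instance (the split name below is taken verbatim from this dataset's config list):

```python
from datasets import load_dataset

# Load a specific historical run of the same task instead of the latest one.
data = load_dataset("open-llm-leaderboard/details_bigscience__bloom-7b1",
	"harness_gsm8k_5",
	split="2023_10_17T22_39_12.950006")
```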
## Latest results
These are the [latest results from run 2023-12-04T13:10:02.911977](https://huggingface.co/datasets/open-llm-leaderboard/details_bigscience__bloom-7b1/blob/main/results_2023-12-04T13-10-02.911977.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.013646702047005308,
"acc_stderr": 0.0031957470754807806
},
"harness|gsm8k|5": {
"acc": 0.013646702047005308,
"acc_stderr": 0.0031957470754807806
}
}
```
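To work with these numbers programmatically rather than copying them from the JSON above, the aggregated metrics can also be loaded from the "results" configuration (a sketch; "latest" is a split declared by that config and resolves to results_2023-12-04T13-10-02.911977.parquet for this run):

```python
from datasets import load_dataset

# Each row holds the aggregated metrics of one run; the exact schema may vary.
results = load_dataset("open-llm-leaderboard/details_bigscience__bloom-7b1",
	"results",
	split="latest")
```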
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_bigscience__bloom-7b1
|
[
"region:us"
] |
2023-08-17T23:14:38+00:00
|
{"pretty_name": "Evaluation run of bigscience/bloom-7b1", "dataset_summary": "Dataset automatically created during the evaluation run of model [bigscience/bloom-7b1](https://huggingface.co/bigscience/bloom-7b1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 10 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_bigscience__bloom-7b1\",\n\t\"harness_gsm8k_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-04T13:10:02.911977](https://huggingface.co/datasets/open-llm-leaderboard/details_bigscience__bloom-7b1/blob/main/results_2023-12-04T13-10-02.911977.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.013646702047005308,\n \"acc_stderr\": 0.0031957470754807806\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.013646702047005308,\n \"acc_stderr\": 0.0031957470754807806\n }\n}\n```", "repo_url": "https://huggingface.co/bigscience/bloom-7b1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_01T14_42_42.953249", "path": ["**/details_harness|arc:challenge|25_2023-08-01T14:42:42.953249.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-01T14:42:42.953249.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_17T22_39_12.950006", "path": ["**/details_harness|drop|3_2023-10-17T22-39-12.950006.parquet"]}, {"split": "2023_10_19T04_28_54.166367", "path": ["**/details_harness|drop|3_2023-10-19T04-28-54.166367.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-19T04-28-54.166367.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_17T22_39_12.950006", "path": ["**/details_harness|gsm8k|5_2023-10-17T22-39-12.950006.parquet"]}, {"split": "2023_10_19T04_28_54.166367", "path": ["**/details_harness|gsm8k|5_2023-10-19T04-28-54.166367.parquet"]}, {"split": "2023_12_03T15_05_35.973526", "path": ["**/details_harness|gsm8k|5_2023-12-03T15-05-35.973526.parquet"]}, {"split": "2023_12_03T16_09_00.043447", "path": ["**/details_harness|gsm8k|5_2023-12-03T16-09-00.043447.parquet"]}, {"split": "2023_12_03T16_09_29.945299", "path": ["**/details_harness|gsm8k|5_2023-12-03T16-09-29.945299.parquet"]}, {"split": "2023_12_04T09_58_57.284123", "path": ["**/details_harness|gsm8k|5_2023-12-04T09-58-57.284123.parquet"]}, {"split": "2023_12_04T09_59_23.998430", "path": ["**/details_harness|gsm8k|5_2023-12-04T09-59-23.998430.parquet"]}, {"split": "2023_12_04T13_09_04.314120", "path": 
["**/details_harness|gsm8k|5_2023-12-04T13-09-04.314120.parquet"]}, {"split": "2023_12_04T13_10_02.911977", "path": ["**/details_harness|gsm8k|5_2023-12-04T13-10-02.911977.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-04T13-10-02.911977.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_01T14_42_42.953249", "path": ["**/details_harness|hellaswag|10_2023-08-01T14:42:42.953249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-01T14:42:42.953249.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_01T14_42_42.953249", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-01T14:42:42.953249.parquet", 
"**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-01T14:42:42.953249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-01T14:42:42.953249.parquet", 
"**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-01T14:42:42.953249.parquet", 
"**/details_harness|hendrycksTest-professional_psychology|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-01T14:42:42.953249.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-01T14:42:42.953249.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_01T14_42_42.953249", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-01T14:42:42.953249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-01T14:42:42.953249.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_01T14_42_42.953249", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-01T14:42:42.953249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-01T14:42:42.953249.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_01T14_42_42.953249", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-01T14:42:42.953249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-01T14:42:42.953249.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_01T14_42_42.953249", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-01T14:42:42.953249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-01T14:42:42.953249.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_01T14_42_42.953249", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-01T14:42:42.953249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-01T14:42:42.953249.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_01T14_42_42.953249", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-01T14:42:42.953249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-01T14:42:42.953249.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_01T14_42_42.953249", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-01T14:42:42.953249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-01T14:42:42.953249.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_01T14_42_42.953249", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-01T14:42:42.953249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-01T14:42:42.953249.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_01T14_42_42.953249", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-01T14:42:42.953249.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-01T14:42:42.953249.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_01T14_42_42.953249", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-01T14:42:42.953249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-01T14:42:42.953249.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_01T14_42_42.953249", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-01T14:42:42.953249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-01T14:42:42.953249.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_01T14_42_42.953249", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-01T14:42:42.953249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-01T14:42:42.953249.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_01T14_42_42.953249", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-01T14:42:42.953249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-01T14:42:42.953249.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_01T14_42_42.953249", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-01T14:42:42.953249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-01T14:42:42.953249.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_01T14_42_42.953249", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-01T14:42:42.953249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-01T14:42:42.953249.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_01T14_42_42.953249", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-01T14:42:42.953249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-01T14:42:42.953249.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_01T14_42_42.953249", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-01T14:42:42.953249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-01T14:42:42.953249.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_01T14_42_42.953249", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-01T14:42:42.953249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-01T14:42:42.953249.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_01T14_42_42.953249", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-01T14:42:42.953249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-01T14:42:42.953249.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_01T14_42_42.953249", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-01T14:42:42.953249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-01T14:42:42.953249.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_01T14_42_42.953249", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-01T14:42:42.953249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-01T14:42:42.953249.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_01T14_42_42.953249", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-01T14:42:42.953249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-01T14:42:42.953249.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_01T14_42_42.953249", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-01T14:42:42.953249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-01T14:42:42.953249.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_01T14_42_42.953249", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-01T14:42:42.953249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-01T14:42:42.953249.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_01T14_42_42.953249", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-01T14:42:42.953249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-01T14:42:42.953249.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_01T14_42_42.953249", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-01T14:42:42.953249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-01T14:42:42.953249.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_01T14_42_42.953249", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-01T14:42:42.953249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-01T14:42:42.953249.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_01T14_42_42.953249", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-01T14:42:42.953249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-01T14:42:42.953249.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_01T14_42_42.953249", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-01T14:42:42.953249.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-01T14:42:42.953249.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_01T14_42_42.953249", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-01T14:42:42.953249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-01T14:42:42.953249.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_01T14_42_42.953249", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-01T14:42:42.953249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-01T14:42:42.953249.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_01T14_42_42.953249", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-01T14:42:42.953249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-01T14:42:42.953249.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_01T14_42_42.953249", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-01T14:42:42.953249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-01T14:42:42.953249.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_01T14_42_42.953249", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-01T14:42:42.953249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-01T14:42:42.953249.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_01T14_42_42.953249", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-01T14:42:42.953249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-01T14:42:42.953249.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_01T14_42_42.953249", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-01T14:42:42.953249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-01T14:42:42.953249.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_01T14_42_42.953249", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-01T14:42:42.953249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-01T14:42:42.953249.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_01T14_42_42.953249", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-01T14:42:42.953249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-01T14:42:42.953249.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_01T14_42_42.953249", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-01T14:42:42.953249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-01T14:42:42.953249.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_01T14_42_42.953249", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-01T14:42:42.953249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-01T14:42:42.953249.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_01T14_42_42.953249", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-01T14:42:42.953249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-01T14:42:42.953249.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_01T14_42_42.953249", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-01T14:42:42.953249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-01T14:42:42.953249.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_01T14_42_42.953249", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-01T14:42:42.953249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-01T14:42:42.953249.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_01T14_42_42.953249", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-01T14:42:42.953249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-01T14:42:42.953249.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_01T14_42_42.953249", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-01T14:42:42.953249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-01T14:42:42.953249.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_01T14_42_42.953249", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-01T14:42:42.953249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-01T14:42:42.953249.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_01T14_42_42.953249", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-01T14:42:42.953249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-01T14:42:42.953249.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_01T14_42_42.953249", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-01T14:42:42.953249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-01T14:42:42.953249.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_01T14_42_42.953249", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-01T14:42:42.953249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-01T14:42:42.953249.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_01T14_42_42.953249", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-01T14:42:42.953249.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-01T14:42:42.953249.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_01T14_42_42.953249", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-01T14:42:42.953249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-01T14:42:42.953249.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_01T14_42_42.953249", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-01T14:42:42.953249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-01T14:42:42.953249.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_01T14_42_42.953249", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-01T14:42:42.953249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-01T14:42:42.953249.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_01T14_42_42.953249", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-01T14:42:42.953249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-01T14:42:42.953249.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_01T14_42_42.953249", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-01T14:42:42.953249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-01T14:42:42.953249.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_01T14_42_42.953249", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-01T14:42:42.953249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-01T14:42:42.953249.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_01T14_42_42.953249", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-01T14:42:42.953249.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-01T14:42:42.953249.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_01T14_42_42.953249", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-01T14:42:42.953249.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-01T14:42:42.953249.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_17T22_39_12.950006", "path": ["**/details_harness|winogrande|5_2023-10-17T22-39-12.950006.parquet"]}, {"split": "2023_10_19T04_28_54.166367", "path": ["**/details_harness|winogrande|5_2023-10-19T04-28-54.166367.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-19T04-28-54.166367.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_01T14_42_42.953249", "path": ["results_2023-08-01T14:42:42.953249.parquet"]}, {"split": "2023_10_17T22_39_12.950006", "path": ["results_2023-10-17T22-39-12.950006.parquet"]}, {"split": "2023_10_19T04_28_54.166367", "path": ["results_2023-10-19T04-28-54.166367.parquet"]}, {"split": "2023_12_03T15_05_35.973526", "path": 
["results_2023-12-03T15-05-35.973526.parquet"]}, {"split": "2023_12_03T16_09_00.043447", "path": ["results_2023-12-03T16-09-00.043447.parquet"]}, {"split": "2023_12_03T16_09_29.945299", "path": ["results_2023-12-03T16-09-29.945299.parquet"]}, {"split": "2023_12_04T09_58_57.284123", "path": ["results_2023-12-04T09-58-57.284123.parquet"]}, {"split": "2023_12_04T09_59_23.998430", "path": ["results_2023-12-04T09-59-23.998430.parquet"]}, {"split": "2023_12_04T13_09_04.314120", "path": ["results_2023-12-04T13-09-04.314120.parquet"]}, {"split": "2023_12_04T13_10_02.911977", "path": ["results_2023-12-04T13-10-02.911977.parquet"]}, {"split": "latest", "path": ["results_2023-12-04T13-10-02.911977.parquet"]}]}]}
|
2023-12-04T13:10:08+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of bigscience/bloom-7b1
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model bigscience/bloom-7b1 on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 10 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
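The code block was stripped from this plain-text rendering. A minimal sketch of the load call, assuming the repository follows the `open-llm-leaderboard/details_<org>__<model>` naming pattern used by the other cards in this dump:

```python
from datasets import load_dataset

# Repo id below is an assumption based on the details_<org>__<model>
# naming convention; "harness_winogrande_5" is one of the 64 task configs.
data = load_dataset("open-llm-leaderboard/details_bigscience__bloom-7b1",
	"harness_winogrande_5",
	split="train")
```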
## Latest results
These are the latest results from run 2023-12-04T13:10:02.911977 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of bigscience/bloom-7b1",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model bigscience/bloom-7b1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 10 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-04T13:10:02.911977(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of bigscience/bloom-7b1",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model bigscience/bloom-7b1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 10 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-04T13:10:02.911977(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
17,
31,
166,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of bigscience/bloom-7b1## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model bigscience/bloom-7b1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 10 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-04T13:10:02.911977(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
b32c53f9bcea65424ebd3a6e966dfa7cfefdc010
|
# Dataset Card for Evaluation run of MBZUAI/lamini-cerebras-256m
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/MBZUAI/lamini-cerebras-256m
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [MBZUAI/lamini-cerebras-256m](https://huggingface.co/MBZUAI/lamini-cerebras-256m) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_MBZUAI__lamini-cerebras-256m",
"harness_winogrande_5",
split="train")
```
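Since the dataset exposes 64 configurations, it can also help to enumerate them programmatically. A small sketch using `get_dataset_config_names` from the `datasets` library:

```python
from datasets import get_dataset_config_names

# One config per evaluated task, plus the aggregated "results" config.
configs = get_dataset_config_names("open-llm-leaderboard/details_MBZUAI__lamini-cerebras-256m")
print(len(configs), configs[:5])
```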
## Latest results
These are the [latest results from run 2023-10-18T21:23:58.159302](https://huggingface.co/datasets/open-llm-leaderboard/details_MBZUAI__lamini-cerebras-256m/blob/main/results_2023-10-18T21-23-58.159302.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.004614093959731544,
"em_stderr": 0.0006940305886353382,
"f1": 0.0485601929530202,
"f1_stderr": 0.001416776057030896,
"acc": 0.2600631412786109,
"acc_stderr": 0.007020548332172165
},
"harness|drop|3": {
"em": 0.004614093959731544,
"em_stderr": 0.0006940305886353382,
"f1": 0.0485601929530202,
"f1_stderr": 0.001416776057030896
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.5201262825572218,
"acc_stderr": 0.01404109666434433
}
}
```
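To reproduce the aggregated numbers above, the "results" configuration can be loaded directly. A minimal sketch; the "latest" split name comes from the config list in this card's metadata:

```python
from datasets import load_dataset

# "results" aggregates all metrics; the "latest" split points to the most
# recent evaluation run.
results = load_dataset("open-llm-leaderboard/details_MBZUAI__lamini-cerebras-256m",
	"results",
	split="latest")
print(results[0])
```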
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_MBZUAI__lamini-cerebras-256m
|
[
"region:us"
] |
2023-08-17T23:14:47+00:00
|
{"pretty_name": "Evaluation run of MBZUAI/lamini-cerebras-256m", "dataset_summary": "Dataset automatically created during the evaluation run of model [MBZUAI/lamini-cerebras-256m](https://huggingface.co/MBZUAI/lamini-cerebras-256m) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_MBZUAI__lamini-cerebras-256m\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-18T21:23:58.159302](https://huggingface.co/datasets/open-llm-leaderboard/details_MBZUAI__lamini-cerebras-256m/blob/main/results_2023-10-18T21-23-58.159302.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.004614093959731544,\n \"em_stderr\": 0.0006940305886353382,\n \"f1\": 0.0485601929530202,\n \"f1_stderr\": 0.001416776057030896,\n \"acc\": 0.2600631412786109,\n \"acc_stderr\": 0.007020548332172165\n },\n \"harness|drop|3\": {\n \"em\": 0.004614093959731544,\n \"em_stderr\": 0.0006940305886353382,\n \"f1\": 0.0485601929530202,\n \"f1_stderr\": 0.001416776057030896\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5201262825572218,\n \"acc_stderr\": 0.01404109666434433\n }\n}\n```", "repo_url": "https://huggingface.co/MBZUAI/lamini-cerebras-256m", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T14_03_54.782051", "path": ["**/details_harness|arc:challenge|25_2023-07-19T14:03:54.782051.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T14:03:54.782051.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_18T21_23_58.159302", "path": ["**/details_harness|drop|3_2023-10-18T21-23-58.159302.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-18T21-23-58.159302.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_18T21_23_58.159302", "path": ["**/details_harness|gsm8k|5_2023-10-18T21-23-58.159302.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-18T21-23-58.159302.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T14_03_54.782051", "path": ["**/details_harness|hellaswag|10_2023-07-19T14:03:54.782051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T14:03:54.782051.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_07_19T14_03_54.782051", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:03:54.782051.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:03:54.782051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:03:54.782051.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T14:03:54.782051.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:03:54.782051.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T14_03_54.782051", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:03:54.782051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:03:54.782051.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T14_03_54.782051", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:03:54.782051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:03:54.782051.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T14_03_54.782051", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:03:54.782051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:03:54.782051.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T14_03_54.782051", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:03:54.782051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:03:54.782051.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T14_03_54.782051", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:03:54.782051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:03:54.782051.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T14_03_54.782051", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:03:54.782051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:03:54.782051.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T14_03_54.782051", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:03:54.782051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:03:54.782051.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T14_03_54.782051", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:03:54.782051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:03:54.782051.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T14_03_54.782051", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:03:54.782051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:03:54.782051.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T14_03_54.782051", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:03:54.782051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:03:54.782051.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T14_03_54.782051", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:03:54.782051.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:03:54.782051.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T14_03_54.782051", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:03:54.782051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:03:54.782051.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T14_03_54.782051", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:03:54.782051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:03:54.782051.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T14_03_54.782051", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:03:54.782051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:03:54.782051.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T14_03_54.782051", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:03:54.782051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:03:54.782051.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T14_03_54.782051", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:03:54.782051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:03:54.782051.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T14_03_54.782051", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:03:54.782051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:03:54.782051.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T14_03_54.782051", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:03:54.782051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:03:54.782051.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T14_03_54.782051", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:03:54.782051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:03:54.782051.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T14_03_54.782051", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:03:54.782051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:03:54.782051.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T14_03_54.782051", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:03:54.782051.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:03:54.782051.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T14_03_54.782051", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:03:54.782051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:03:54.782051.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T14_03_54.782051", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:03:54.782051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:03:54.782051.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T14_03_54.782051", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:03:54.782051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:03:54.782051.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T14_03_54.782051", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:03:54.782051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:03:54.782051.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T14_03_54.782051", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:03:54.782051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:03:54.782051.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T14_03_54.782051", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:03:54.782051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:03:54.782051.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T14_03_54.782051", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:03:54.782051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:03:54.782051.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T14_03_54.782051", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:03:54.782051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:03:54.782051.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T14_03_54.782051", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:03:54.782051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:03:54.782051.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T14_03_54.782051", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:03:54.782051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:03:54.782051.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T14_03_54.782051", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:03:54.782051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:03:54.782051.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T14_03_54.782051", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:03:54.782051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:03:54.782051.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T14_03_54.782051", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:03:54.782051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:03:54.782051.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T14_03_54.782051", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:03:54.782051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:03:54.782051.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T14_03_54.782051", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:03:54.782051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:03:54.782051.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T14_03_54.782051", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:03:54.782051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:03:54.782051.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T14_03_54.782051", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:03:54.782051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:03:54.782051.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T14_03_54.782051", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T14:03:54.782051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T14:03:54.782051.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T14_03_54.782051", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:03:54.782051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:03:54.782051.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T14_03_54.782051", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:03:54.782051.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:03:54.782051.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T14_03_54.782051", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:03:54.782051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:03:54.782051.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T14_03_54.782051", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:03:54.782051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:03:54.782051.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T14_03_54.782051", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:03:54.782051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:03:54.782051.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T14_03_54.782051", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:03:54.782051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:03:54.782051.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T14_03_54.782051", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:03:54.782051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:03:54.782051.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T14_03_54.782051", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:03:54.782051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:03:54.782051.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T14_03_54.782051", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:03:54.782051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:03:54.782051.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T14_03_54.782051", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:03:54.782051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:03:54.782051.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T14_03_54.782051", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:03:54.782051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:03:54.782051.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T14_03_54.782051", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:03:54.782051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:03:54.782051.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T14_03_54.782051", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:03:54.782051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:03:54.782051.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T14_03_54.782051", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:03:54.782051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:03:54.782051.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T14_03_54.782051", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:03:54.782051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:03:54.782051.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T14_03_54.782051", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:03:54.782051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:03:54.782051.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T14_03_54.782051", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T14:03:54.782051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T14:03:54.782051.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T14_03_54.782051", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:03:54.782051.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:03:54.782051.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T14_03_54.782051", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T14:03:54.782051.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T14:03:54.782051.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_18T21_23_58.159302", "path": ["**/details_harness|winogrande|5_2023-10-18T21-23-58.159302.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-18T21-23-58.159302.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T14_03_54.782051", "path": ["results_2023-07-19T14:03:54.782051.parquet"]}, {"split": "2023_10_18T21_23_58.159302", "path": ["results_2023-10-18T21-23-58.159302.parquet"]}, {"split": "latest", "path": ["results_2023-10-18T21-23-58.159302.parquet"]}]}]}
|
2023-10-18T20:24:10+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of MBZUAI/lamini-cerebras-256m
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model MBZUAI/lamini-cerebras-256m on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
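The code block was stripped from this plain-text rendering; the snippet from the full card above is:

```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_MBZUAI__lamini-cerebras-256m",
	"harness_winogrande_5",
	split="train")
```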
## Latest results
These are the latest results from run 2023-10-18T21:23:58.159302 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of MBZUAI/lamini-cerebras-256m",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model MBZUAI/lamini-cerebras-256m on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-18T21:23:58.159302(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of MBZUAI/lamini-cerebras-256m",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model MBZUAI/lamini-cerebras-256m on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-18T21:23:58.159302(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
21,
31,
169,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of MBZUAI/lamini-cerebras-256m## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model MBZUAI/lamini-cerebras-256m on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-18T21:23:58.159302(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
b4fe5f8e4c41c8b2b500a623e368596fce27774f
|
# Dataset Card for Evaluation run of MBZUAI/lamini-neo-125m
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/MBZUAI/lamini-neo-125m
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [MBZUAI/lamini-neo-125m](https://huggingface.co/MBZUAI/lamini-neo-125m) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_MBZUAI__lamini-neo-125m",
"harness_winogrande_5",
split="train")
```
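Once loaded, the split is a regular `datasets.Dataset`; a quick inspection sketch:

```python
# Inspect the loaded details: columns, size, and the first record.
print(data.column_names)
print(len(data), "rows in the latest winogrande run")
print(data[0])
```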
## Latest results
These are the [latest results from run 2023-10-15T21:31:39.486023](https://huggingface.co/datasets/open-llm-leaderboard/details_MBZUAI__lamini-neo-125m/blob/main/results_2023-10-15T21-31-39.486023.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0036703020134228187,
"em_stderr": 0.0006192871806511256,
"f1": 0.05480809563758396,
"f1_stderr": 0.0014349695374813807,
"acc": 0.2612470402525651,
"acc_stderr": 0.007019128912029945
},
"harness|drop|3": {
"em": 0.0036703020134228187,
"em_stderr": 0.0006192871806511256,
"f1": 0.05480809563758396,
"f1_stderr": 0.0014349695374813807
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.5224940805051302,
"acc_stderr": 0.01403825782405989
}
}
```
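The aggregated numbers above are also stored in the "results" configuration; a minimal sketch of loading them (the "results" config and "latest" split names come from this repository's own metadata):

```python
from datasets import load_dataset

# The "results" configuration aggregates the per-task metrics for each run;
# the "latest" split points at the most recent evaluation.
results = load_dataset("open-llm-leaderboard/details_MBZUAI__lamini-neo-125m",
	"results",
	split="latest")
print(results[0])  # field layout assumed to mirror the JSON summary above
```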
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_MBZUAI__lamini-neo-125m
|
[
"region:us"
] |
2023-08-17T23:14:55+00:00
|
{"pretty_name": "Evaluation run of MBZUAI/lamini-neo-125m", "dataset_summary": "Dataset automatically created during the evaluation run of model [MBZUAI/lamini-neo-125m](https://huggingface.co/MBZUAI/lamini-neo-125m) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_MBZUAI__lamini-neo-125m\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-15T21:31:39.486023](https://huggingface.co/datasets/open-llm-leaderboard/details_MBZUAI__lamini-neo-125m/blob/main/results_2023-10-15T21-31-39.486023.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0036703020134228187,\n \"em_stderr\": 0.0006192871806511256,\n \"f1\": 0.05480809563758396,\n \"f1_stderr\": 0.0014349695374813807,\n \"acc\": 0.2612470402525651,\n \"acc_stderr\": 0.007019128912029945\n },\n \"harness|drop|3\": {\n \"em\": 0.0036703020134228187,\n \"em_stderr\": 0.0006192871806511256,\n \"f1\": 0.05480809563758396,\n \"f1_stderr\": 0.0014349695374813807\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5224940805051302,\n \"acc_stderr\": 0.01403825782405989\n }\n}\n```", "repo_url": "https://huggingface.co/MBZUAI/lamini-neo-125m", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T13_58_35.727802", "path": ["**/details_harness|arc:challenge|25_2023-07-19T13:58:35.727802.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T13:58:35.727802.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_15T21_31_39.486023", "path": ["**/details_harness|drop|3_2023-10-15T21-31-39.486023.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-15T21-31-39.486023.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_15T21_31_39.486023", "path": ["**/details_harness|gsm8k|5_2023-10-15T21-31-39.486023.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-15T21-31-39.486023.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T13_58_35.727802", "path": ["**/details_harness|hellaswag|10_2023-07-19T13:58:35.727802.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T13:58:35.727802.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T13_58_35.727802", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:58:35.727802.parquet", 
"**/details_harness|hendrycksTest-management|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:58:35.727802.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:58:35.727802.parquet", 
"**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T13:58:35.727802.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:58:35.727802.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_19T13_58_35.727802", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:58:35.727802.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:58:35.727802.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T13_58_35.727802", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:58:35.727802.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:58:35.727802.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T13_58_35.727802", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:58:35.727802.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:58:35.727802.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T13_58_35.727802", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:58:35.727802.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:58:35.727802.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T13_58_35.727802", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:58:35.727802.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:58:35.727802.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T13_58_35.727802", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:58:35.727802.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:58:35.727802.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T13_58_35.727802", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:58:35.727802.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:58:35.727802.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T13_58_35.727802", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:58:35.727802.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:58:35.727802.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T13_58_35.727802", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:58:35.727802.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:58:35.727802.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T13_58_35.727802", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:58:35.727802.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:58:35.727802.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T13_58_35.727802", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:58:35.727802.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:58:35.727802.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T13_58_35.727802", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:58:35.727802.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:58:35.727802.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T13_58_35.727802", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:58:35.727802.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:58:35.727802.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T13_58_35.727802", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:58:35.727802.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:58:35.727802.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T13_58_35.727802", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:58:35.727802.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:58:35.727802.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T13_58_35.727802", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:58:35.727802.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:58:35.727802.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T13_58_35.727802", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:58:35.727802.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:58:35.727802.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T13_58_35.727802", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:58:35.727802.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:58:35.727802.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T13_58_35.727802", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:58:35.727802.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:58:35.727802.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T13_58_35.727802", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:58:35.727802.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:58:35.727802.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T13_58_35.727802", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:58:35.727802.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:58:35.727802.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T13_58_35.727802", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:58:35.727802.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:58:35.727802.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T13_58_35.727802", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:58:35.727802.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:58:35.727802.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T13_58_35.727802", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:58:35.727802.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:58:35.727802.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T13_58_35.727802", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:58:35.727802.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:58:35.727802.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T13_58_35.727802", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:58:35.727802.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:58:35.727802.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T13_58_35.727802", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:58:35.727802.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:58:35.727802.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T13_58_35.727802", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:58:35.727802.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:58:35.727802.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T13_58_35.727802", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:58:35.727802.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:58:35.727802.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T13_58_35.727802", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:58:35.727802.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:58:35.727802.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T13_58_35.727802", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:58:35.727802.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:58:35.727802.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T13_58_35.727802", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:58:35.727802.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:58:35.727802.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T13_58_35.727802", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:58:35.727802.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:58:35.727802.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T13_58_35.727802", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:58:35.727802.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:58:35.727802.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T13_58_35.727802", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:58:35.727802.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:58:35.727802.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T13_58_35.727802", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:58:35.727802.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:58:35.727802.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T13_58_35.727802", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:58:35.727802.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:58:35.727802.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T13_58_35.727802", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:58:35.727802.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:58:35.727802.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T13_58_35.727802", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T13:58:35.727802.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T13:58:35.727802.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T13_58_35.727802", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:58:35.727802.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:58:35.727802.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T13_58_35.727802", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:58:35.727802.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:58:35.727802.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": 
"2023_07_19T13_58_35.727802", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:58:35.727802.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:58:35.727802.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T13_58_35.727802", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:58:35.727802.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:58:35.727802.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T13_58_35.727802", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:58:35.727802.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:58:35.727802.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T13_58_35.727802", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:58:35.727802.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:58:35.727802.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T13_58_35.727802", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:58:35.727802.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:58:35.727802.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T13_58_35.727802", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:58:35.727802.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:58:35.727802.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T13_58_35.727802", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:58:35.727802.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:58:35.727802.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T13_58_35.727802", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:58:35.727802.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:58:35.727802.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T13_58_35.727802", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:58:35.727802.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:58:35.727802.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T13_58_35.727802", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:58:35.727802.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:58:35.727802.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_19T13_58_35.727802", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:58:35.727802.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:58:35.727802.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T13_58_35.727802", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:58:35.727802.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:58:35.727802.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T13_58_35.727802", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:58:35.727802.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:58:35.727802.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T13_58_35.727802", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:58:35.727802.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:58:35.727802.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T13_58_35.727802", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T13:58:35.727802.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T13:58:35.727802.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T13_58_35.727802", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:58:35.727802.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:58:35.727802.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T13_58_35.727802", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T13:58:35.727802.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T13:58:35.727802.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_15T21_31_39.486023", "path": ["**/details_harness|winogrande|5_2023-10-15T21-31-39.486023.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-15T21-31-39.486023.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T13_58_35.727802", "path": ["results_2023-07-19T13:58:35.727802.parquet"]}, {"split": "2023_10_15T21_31_39.486023", "path": ["results_2023-10-15T21-31-39.486023.parquet"]}, {"split": "latest", "path": ["results_2023-10-15T21-31-39.486023.parquet"]}]}]}
|
2023-10-15T20:31:51+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of MBZUAI/lamini-neo-125m
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model MBZUAI/lamini-neo-125m on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
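Besides "latest", each configuration also exposes timestamp-named splits, one per run; a sketch of loading one specific run (split name taken from this repository's config metadata):

```python
from datasets import load_dataset

# Load one specific run via its timestamp-named split instead of "latest".
data = load_dataset("open-llm-leaderboard/details_MBZUAI__lamini-neo-125m",
	"harness_gsm8k_5",
	split="2023_10_15T21_31_39.486023")
```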
## Latest results
These are the latest results from run 2023-10-15T21:31:39.486023 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of MBZUAI/lamini-neo-125m",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model MBZUAI/lamini-neo-125m on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-15T21:31:39.486023(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of MBZUAI/lamini-neo-125m",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model MBZUAI/lamini-neo-125m on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-15T21:31:39.486023(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
20,
31,
168,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of MBZUAI/lamini-neo-125m## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model MBZUAI/lamini-neo-125m on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-15T21:31:39.486023(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
5777c0eee21573e88d80be9ef3883166ccb1cfe3
|
# Dataset Card for Evaluation run of MBZUAI/lamini-cerebras-1.3b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/MBZUAI/lamini-cerebras-1.3b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [MBZUAI/lamini-cerebras-1.3b](https://huggingface.co/MBZUAI/lamini-cerebras-1.3b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_MBZUAI__lamini-cerebras-1.3b",
"harness_winogrande_5",
split="train")
```
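With 64 configurations per details repository, it can help to enumerate them first; a sketch using `datasets.get_dataset_config_names`:

```python
from datasets import get_dataset_config_names

# List the evaluation configurations available in this details repository.
configs = get_dataset_config_names("open-llm-leaderboard/details_MBZUAI__lamini-cerebras-1.3b")
print(len(configs), "configs, e.g.:", configs[:5])
```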
## Latest results
These are the [latest results from run 2023-10-14T20:18:17.914204](https://huggingface.co/datasets/open-llm-leaderboard/details_MBZUAI__lamini-cerebras-1.3b/blob/main/results_2023-10-14T20-18-17.914204.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.018141778523489933,
"em_stderr": 0.0013667968592600622,
"f1": 0.0807330117449663,
"f1_stderr": 0.0020354056846065625,
"acc": 0.25295974743488553,
"acc_stderr": 0.007025750419242903
},
"harness|drop|3": {
"em": 0.018141778523489933,
"em_stderr": 0.0013667968592600622,
"f1": 0.0807330117449663,
"f1_stderr": 0.0020354056846065625
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.5059194948697711,
"acc_stderr": 0.014051500838485807
}
}
```
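For a quick side-by-side view, the per-task metrics above flatten cleanly into a small table (a sketch, assuming pandas is available; scores copied from the JSON summary):

```python
import pandas as pd

# Per-task scores copied from the results JSON above.
metrics = {
    "drop (f1)": 0.0807330117449663,
    "gsm8k (acc)": 0.0,
    "winogrande (acc)": 0.5059194948697711,
}
df = pd.DataFrame(list(metrics.items()), columns=["task", "score"])
print(df)
```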
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_MBZUAI__lamini-cerebras-1.3b
|
[
"region:us"
] |
2023-08-17T23:15:07+00:00
|
{"pretty_name": "Evaluation run of MBZUAI/lamini-cerebras-1.3b", "dataset_summary": "Dataset automatically created during the evaluation run of model [MBZUAI/lamini-cerebras-1.3b](https://huggingface.co/MBZUAI/lamini-cerebras-1.3b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_MBZUAI__lamini-cerebras-1.3b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-14T20:18:17.914204](https://huggingface.co/datasets/open-llm-leaderboard/details_MBZUAI__lamini-cerebras-1.3b/blob/main/results_2023-10-14T20-18-17.914204.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.018141778523489933,\n \"em_stderr\": 0.0013667968592600622,\n \"f1\": 0.0807330117449663,\n \"f1_stderr\": 0.0020354056846065625,\n \"acc\": 0.25295974743488553,\n \"acc_stderr\": 0.007025750419242903\n },\n \"harness|drop|3\": {\n \"em\": 0.018141778523489933,\n \"em_stderr\": 0.0013667968592600622,\n \"f1\": 0.0807330117449663,\n \"f1_stderr\": 0.0020354056846065625\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5059194948697711,\n \"acc_stderr\": 0.014051500838485807\n }\n}\n```", "repo_url": "https://huggingface.co/MBZUAI/lamini-cerebras-1.3b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T14_57_40.415603", "path": ["**/details_harness|arc:challenge|25_2023-07-19T14:57:40.415603.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T14:57:40.415603.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_14T20_18_17.914204", "path": ["**/details_harness|drop|3_2023-10-14T20-18-17.914204.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-14T20-18-17.914204.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_14T20_18_17.914204", "path": ["**/details_harness|gsm8k|5_2023-10-14T20-18-17.914204.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-14T20-18-17.914204.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T14_57_40.415603", "path": ["**/details_harness|hellaswag|10_2023-07-19T14:57:40.415603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T14:57:40.415603.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_07_19T14_57_40.415603", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:57:40.415603.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:57:40.415603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:57:40.415603.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T14:57:40.415603.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:57:40.415603.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T14_57_40.415603", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:57:40.415603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:57:40.415603.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T14_57_40.415603", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:57:40.415603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:57:40.415603.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T14_57_40.415603", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:57:40.415603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:57:40.415603.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T14_57_40.415603", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:57:40.415603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:57:40.415603.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T14_57_40.415603", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:57:40.415603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:57:40.415603.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T14_57_40.415603", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:57:40.415603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:57:40.415603.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T14_57_40.415603", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:57:40.415603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:57:40.415603.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T14_57_40.415603", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:57:40.415603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:57:40.415603.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T14_57_40.415603", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:57:40.415603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:57:40.415603.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T14_57_40.415603", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:57:40.415603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:57:40.415603.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T14_57_40.415603", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:57:40.415603.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:57:40.415603.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T14_57_40.415603", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:57:40.415603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:57:40.415603.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T14_57_40.415603", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:57:40.415603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:57:40.415603.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T14_57_40.415603", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:57:40.415603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:57:40.415603.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T14_57_40.415603", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:57:40.415603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:57:40.415603.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T14_57_40.415603", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:57:40.415603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:57:40.415603.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T14_57_40.415603", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:57:40.415603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:57:40.415603.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T14_57_40.415603", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:57:40.415603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:57:40.415603.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T14_57_40.415603", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:57:40.415603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:57:40.415603.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T14_57_40.415603", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:57:40.415603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:57:40.415603.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T14_57_40.415603", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:57:40.415603.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:57:40.415603.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T14_57_40.415603", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:57:40.415603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:57:40.415603.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T14_57_40.415603", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:57:40.415603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:57:40.415603.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T14_57_40.415603", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:57:40.415603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:57:40.415603.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T14_57_40.415603", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:57:40.415603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:57:40.415603.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T14_57_40.415603", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:57:40.415603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:57:40.415603.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T14_57_40.415603", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:57:40.415603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:57:40.415603.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T14_57_40.415603", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:57:40.415603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:57:40.415603.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T14_57_40.415603", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:57:40.415603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:57:40.415603.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T14_57_40.415603", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:57:40.415603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:57:40.415603.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T14_57_40.415603", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:57:40.415603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:57:40.415603.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T14_57_40.415603", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:57:40.415603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:57:40.415603.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T14_57_40.415603", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:57:40.415603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:57:40.415603.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T14_57_40.415603", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:57:40.415603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:57:40.415603.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T14_57_40.415603", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:57:40.415603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:57:40.415603.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T14_57_40.415603", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:57:40.415603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:57:40.415603.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T14_57_40.415603", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:57:40.415603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:57:40.415603.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T14_57_40.415603", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:57:40.415603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:57:40.415603.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T14_57_40.415603", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T14:57:40.415603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T14:57:40.415603.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T14_57_40.415603", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:57:40.415603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:57:40.415603.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T14_57_40.415603", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:57:40.415603.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:57:40.415603.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T14_57_40.415603", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:57:40.415603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:57:40.415603.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T14_57_40.415603", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:57:40.415603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:57:40.415603.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T14_57_40.415603", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:57:40.415603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:57:40.415603.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T14_57_40.415603", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:57:40.415603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:57:40.415603.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T14_57_40.415603", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:57:40.415603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:57:40.415603.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T14_57_40.415603", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:57:40.415603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:57:40.415603.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T14_57_40.415603", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:57:40.415603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:57:40.415603.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T14_57_40.415603", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:57:40.415603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:57:40.415603.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T14_57_40.415603", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:57:40.415603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:57:40.415603.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T14_57_40.415603", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:57:40.415603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:57:40.415603.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T14_57_40.415603", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:57:40.415603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:57:40.415603.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T14_57_40.415603", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:57:40.415603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:57:40.415603.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T14_57_40.415603", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:57:40.415603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:57:40.415603.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T14_57_40.415603", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:57:40.415603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:57:40.415603.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T14_57_40.415603", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T14:57:40.415603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T14:57:40.415603.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T14_57_40.415603", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:57:40.415603.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:57:40.415603.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T14_57_40.415603", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T14:57:40.415603.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T14:57:40.415603.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_14T20_18_17.914204", "path": ["**/details_harness|winogrande|5_2023-10-14T20-18-17.914204.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-14T20-18-17.914204.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T14_57_40.415603", "path": ["results_2023-07-19T14:57:40.415603.parquet"]}, {"split": "2023_10_14T20_18_17.914204", "path": ["results_2023-10-14T20-18-17.914204.parquet"]}, {"split": "latest", "path": ["results_2023-10-14T20-18-17.914204.parquet"]}]}]}
|
2023-10-14T19:18:29+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of MBZUAI/lamini-cerebras-1.3b
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model MBZUAI/lamini-cerebras-1.3b on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
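```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_MBZUAI__lamini-cerebras-1.3b",
	"harness_winogrande_5",
	split="train")
```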
## Latest results
These are the latest results from run 2023-10-14T20:18:17.914204 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
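```python
{
    "all": {
        "em": 0.018141778523489933,
        "em_stderr": 0.0013667968592600622,
        "f1": 0.0807330117449663,
        "f1_stderr": 0.0020354056846065625,
        "acc": 0.25295974743488553,
        "acc_stderr": 0.007025750419242903
    },
    "harness|drop|3": {
        "em": 0.018141778523489933,
        "em_stderr": 0.0013667968592600622,
        "f1": 0.0807330117449663,
        "f1_stderr": 0.0020354056846065625
    },
    "harness|gsm8k|5": {
        "acc": 0.0,
        "acc_stderr": 0.0
    },
    "harness|winogrande|5": {
        "acc": 0.5059194948697711,
        "acc_stderr": 0.014051500838485807
    }
}
```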
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of MBZUAI/lamini-cerebras-1.3b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model MBZUAI/lamini-cerebras-1.3b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-14T20:18:17.914204(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of MBZUAI/lamini-cerebras-1.3b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model MBZUAI/lamini-cerebras-1.3b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-14T20:18:17.914204(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
21,
31,
169,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of MBZUAI/lamini-cerebras-1.3b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model MBZUAI/lamini-cerebras-1.3b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-14T20:18:17.914204(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
b08fe1fbb84f9117000e595bc08a2d5c223362f9
|
# Dataset Card for Evaluation run of MBZUAI/LaMini-GPT-1.5B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/MBZUAI/LaMini-GPT-1.5B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [MBZUAI/LaMini-GPT-1.5B](https://huggingface.co/MBZUAI/LaMini-GPT-1.5B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_MBZUAI__LaMini-GPT-1.5B",
"harness_winogrande_5",
split="train")
```
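Since every evaluated task is exposed as its own configuration, it can be convenient to enumerate configurations and run splits programmatically before loading anything. Below is a minimal sketch, assuming the `get_dataset_config_names` and `get_dataset_split_names` helpers from the `datasets` library (this snippet is illustrative and not part of the original card):

```python
from datasets import get_dataset_config_names, get_dataset_split_names, load_dataset

repo = "open-llm-leaderboard/details_MBZUAI__LaMini-GPT-1.5B"

# One configuration per evaluated task, plus the aggregated "results" config.
configs = get_dataset_config_names(repo)
print(configs)

# Each configuration has one split per run timestamp, plus a "latest" alias.
print(get_dataset_split_names(repo, "harness_gsm8k_5"))

# The aggregated metrics shown below live in the "results" configuration.
results = load_dataset(repo, "results", split="latest")
```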
## Latest results
These are the [latest results from run 2023-10-18T04:41:54.812526](https://huggingface.co/datasets/open-llm-leaderboard/details_MBZUAI__LaMini-GPT-1.5B/blob/main/results_2023-10-18T04-41-54.812526.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.04383389261744967,
"em_stderr": 0.0020965795420484303,
"f1": 0.12851510067114089,
"f1_stderr": 0.00259878917049905,
"acc": 0.27940015785319655,
"acc_stderr": 0.006977487536417363
},
"harness|drop|3": {
"em": 0.04383389261744967,
"em_stderr": 0.0020965795420484303,
"f1": 0.12851510067114089,
"f1_stderr": 0.00259878917049905
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.5588003157063931,
"acc_stderr": 0.013954975072834726
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_MBZUAI__LaMini-GPT-1.5B
|
[
"region:us"
] |
2023-08-17T23:15:16+00:00
|
{"pretty_name": "Evaluation run of MBZUAI/LaMini-GPT-1.5B", "dataset_summary": "Dataset automatically created during the evaluation run of model [MBZUAI/LaMini-GPT-1.5B](https://huggingface.co/MBZUAI/LaMini-GPT-1.5B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_MBZUAI__LaMini-GPT-1.5B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-18T04:41:54.812526](https://huggingface.co/datasets/open-llm-leaderboard/details_MBZUAI__LaMini-GPT-1.5B/blob/main/results_2023-10-18T04-41-54.812526.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.04383389261744967,\n \"em_stderr\": 0.0020965795420484303,\n \"f1\": 0.12851510067114089,\n \"f1_stderr\": 0.00259878917049905,\n \"acc\": 0.27940015785319655,\n \"acc_stderr\": 0.006977487536417363\n },\n \"harness|drop|3\": {\n \"em\": 0.04383389261744967,\n \"em_stderr\": 0.0020965795420484303,\n \"f1\": 0.12851510067114089,\n \"f1_stderr\": 0.00259878917049905\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5588003157063931,\n \"acc_stderr\": 0.013954975072834726\n }\n}\n```", "repo_url": "https://huggingface.co/MBZUAI/LaMini-GPT-1.5B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T15_24_24.241111", "path": ["**/details_harness|arc:challenge|25_2023-07-19T15:24:24.241111.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T15:24:24.241111.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_18T04_41_54.812526", "path": ["**/details_harness|drop|3_2023-10-18T04-41-54.812526.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-18T04-41-54.812526.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_18T04_41_54.812526", "path": ["**/details_harness|gsm8k|5_2023-10-18T04-41-54.812526.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-18T04-41-54.812526.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T15_24_24.241111", "path": ["**/details_harness|hellaswag|10_2023-07-19T15:24:24.241111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T15:24:24.241111.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T15_24_24.241111", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:24:24.241111.parquet", 
"**/details_harness|hendrycksTest-management|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:24:24.241111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:24:24.241111.parquet", 
"**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T15:24:24.241111.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:24:24.241111.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_19T15_24_24.241111", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:24:24.241111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T15:24:24.241111.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T15_24_24.241111", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:24:24.241111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T15:24:24.241111.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T15_24_24.241111", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:24:24.241111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T15:24:24.241111.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T15_24_24.241111", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:24:24.241111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T15:24:24.241111.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T15_24_24.241111", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:24:24.241111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T15:24:24.241111.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T15_24_24.241111", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:24:24.241111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T15:24:24.241111.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T15_24_24.241111", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:24:24.241111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T15:24:24.241111.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T15_24_24.241111", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:24:24.241111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T15:24:24.241111.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T15_24_24.241111", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:24:24.241111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T15:24:24.241111.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T15_24_24.241111", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:24:24.241111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T15:24:24.241111.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T15_24_24.241111", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:24:24.241111.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T15:24:24.241111.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T15_24_24.241111", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:24:24.241111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T15:24:24.241111.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T15_24_24.241111", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:24:24.241111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T15:24:24.241111.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T15_24_24.241111", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:24:24.241111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T15:24:24.241111.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T15_24_24.241111", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:24:24.241111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T15:24:24.241111.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T15_24_24.241111", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:24:24.241111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T15:24:24.241111.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T15_24_24.241111", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:24:24.241111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T15:24:24.241111.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T15_24_24.241111", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:24:24.241111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T15:24:24.241111.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T15_24_24.241111", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:24:24.241111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T15:24:24.241111.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T15_24_24.241111", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:24:24.241111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T15:24:24.241111.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T15_24_24.241111", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:24:24.241111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T15:24:24.241111.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T15_24_24.241111", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:24:24.241111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T15:24:24.241111.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T15_24_24.241111", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:24:24.241111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T15:24:24.241111.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T15_24_24.241111", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:24:24.241111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T15:24:24.241111.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T15_24_24.241111", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:24:24.241111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T15:24:24.241111.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T15_24_24.241111", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:24:24.241111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T15:24:24.241111.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T15_24_24.241111", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:24:24.241111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T15:24:24.241111.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T15_24_24.241111", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:24:24.241111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T15:24:24.241111.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T15_24_24.241111", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:24:24.241111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T15:24:24.241111.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T15_24_24.241111", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:24:24.241111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T15:24:24.241111.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T15_24_24.241111", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:24:24.241111.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T15:24:24.241111.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T15_24_24.241111", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:24:24.241111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T15:24:24.241111.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T15_24_24.241111", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:24:24.241111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T15:24:24.241111.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T15_24_24.241111", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:24:24.241111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T15:24:24.241111.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T15_24_24.241111", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:24:24.241111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T15:24:24.241111.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T15_24_24.241111", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:24:24.241111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T15:24:24.241111.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T15_24_24.241111", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:24:24.241111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T15:24:24.241111.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T15_24_24.241111", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:24:24.241111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T15:24:24.241111.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T15_24_24.241111", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T15:24:24.241111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T15:24:24.241111.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T15_24_24.241111", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:24:24.241111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T15:24:24.241111.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T15_24_24.241111", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:24:24.241111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T15:24:24.241111.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": 
"2023_07_19T15_24_24.241111", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:24:24.241111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T15:24:24.241111.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T15_24_24.241111", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:24:24.241111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T15:24:24.241111.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T15_24_24.241111", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:24:24.241111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T15:24:24.241111.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T15_24_24.241111", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:24:24.241111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T15:24:24.241111.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T15_24_24.241111", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:24:24.241111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T15:24:24.241111.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T15_24_24.241111", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:24:24.241111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T15:24:24.241111.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T15_24_24.241111", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:24:24.241111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T15:24:24.241111.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T15_24_24.241111", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:24:24.241111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T15:24:24.241111.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T15_24_24.241111", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:24:24.241111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T15:24:24.241111.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T15_24_24.241111", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:24:24.241111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T15:24:24.241111.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_19T15_24_24.241111", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:24:24.241111.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T15:24:24.241111.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T15_24_24.241111", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:24:24.241111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T15:24:24.241111.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T15_24_24.241111", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:24:24.241111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T15:24:24.241111.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T15_24_24.241111", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:24:24.241111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T15:24:24.241111.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T15_24_24.241111", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T15:24:24.241111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T15:24:24.241111.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T15_24_24.241111", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:24:24.241111.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T15:24:24.241111.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T15_24_24.241111", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T15:24:24.241111.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T15:24:24.241111.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_18T04_41_54.812526", "path": ["**/details_harness|winogrande|5_2023-10-18T04-41-54.812526.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-18T04-41-54.812526.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T15_24_24.241111", "path": ["results_2023-07-19T15:24:24.241111.parquet"]}, {"split": "2023_10_18T04_41_54.812526", "path": ["results_2023-10-18T04-41-54.812526.parquet"]}, {"split": "latest", "path": ["results_2023-10-18T04-41-54.812526.parquet"]}]}]}
|
2023-10-18T03:42:06+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of MBZUAI/LaMini-GPT-1.5B
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model MBZUAI/LaMini-GPT-1.5B on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
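A minimal sketch of such a call, assuming the details repository follows the leaderboard's usual `open-llm-leaderboard/details_<org>__<model>` naming convention (the repository name below is inferred from that convention, not confirmed by this card):

```python
from datasets import load_dataset

# Repository name inferred from the leaderboard naming convention;
# "harness_winogrande_5" is one of the 64 per-task configurations.
data = load_dataset("open-llm-leaderboard/details_MBZUAI__LaMini-GPT-1.5B",
	"harness_winogrande_5",
	split="train")
```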
## Latest results
These are the latest results from run 2023-10-18T04:41:54.812526 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of MBZUAI/LaMini-GPT-1.5B",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model MBZUAI/LaMini-GPT-1.5B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-18T04:41:54.812526(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of MBZUAI/LaMini-GPT-1.5B",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model MBZUAI/LaMini-GPT-1.5B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-18T04:41:54.812526(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
21,
31,
169,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of MBZUAI/LaMini-GPT-1.5B## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model MBZUAI/LaMini-GPT-1.5B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-18T04:41:54.812526(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
9a7f5cc2a17986e5ae6ddf7b8ce59f6225cdecb7
|
# Dataset Card for Evaluation run of MBZUAI/lamini-cerebras-111m
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/MBZUAI/lamini-cerebras-111m
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [MBZUAI/lamini-cerebras-111m](https://huggingface.co/MBZUAI/lamini-cerebras-111m) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_MBZUAI__lamini-cerebras-111m",
"harness_winogrande_5",
split="train")
```
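To pin a specific run instead of the latest one, you can pass the timestamped split name instead of "train" (the split label below is taken from the "harness_winogrande_5" config in this card's metadata):

```python
from datasets import load_dataset

# Timestamped split name, as listed in the dataset's config metadata
data = load_dataset("open-llm-leaderboard/details_MBZUAI__lamini-cerebras-111m",
	"harness_winogrande_5",
	split="2023_10_18T18_05_40.911064")
```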
## Latest results
These are the [latest results from run 2023-10-18T18:05:40.911064](https://huggingface.co/datasets/open-llm-leaderboard/details_MBZUAI__lamini-cerebras-111m/blob/main/results_2023-10-18T18-05-40.911064.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.001363255033557047,
"em_stderr": 0.0003778609196460529,
"f1": 0.02216757550335575,
"f1_stderr": 0.0009735143977020524,
"acc": 0.25611681136543013,
"acc_stderr": 0.007024139410202808
},
"harness|drop|3": {
"em": 0.001363255033557047,
"em_stderr": 0.0003778609196460529,
"f1": 0.02216757550335575,
"f1_stderr": 0.0009735143977020524
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.5122336227308603,
"acc_stderr": 0.014048278820405616
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_MBZUAI__lamini-cerebras-111m
|
[
"region:us"
] |
2023-08-17T23:15:24+00:00
|
{"pretty_name": "Evaluation run of MBZUAI/lamini-cerebras-111m", "dataset_summary": "Dataset automatically created during the evaluation run of model [MBZUAI/lamini-cerebras-111m](https://huggingface.co/MBZUAI/lamini-cerebras-111m) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_MBZUAI__lamini-cerebras-111m\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-18T18:05:40.911064](https://huggingface.co/datasets/open-llm-leaderboard/details_MBZUAI__lamini-cerebras-111m/blob/main/results_2023-10-18T18-05-40.911064.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.001363255033557047,\n \"em_stderr\": 0.0003778609196460529,\n \"f1\": 0.02216757550335575,\n \"f1_stderr\": 0.0009735143977020524,\n \"acc\": 0.25611681136543013,\n \"acc_stderr\": 0.007024139410202808\n },\n \"harness|drop|3\": {\n \"em\": 0.001363255033557047,\n \"em_stderr\": 0.0003778609196460529,\n \"f1\": 0.02216757550335575,\n \"f1_stderr\": 0.0009735143977020524\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5122336227308603,\n \"acc_stderr\": 0.014048278820405616\n }\n}\n```", "repo_url": "https://huggingface.co/MBZUAI/lamini-cerebras-111m", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T13_45_36.693423", "path": ["**/details_harness|arc:challenge|25_2023-07-19T13:45:36.693423.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T13:45:36.693423.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_18T18_05_40.911064", "path": ["**/details_harness|drop|3_2023-10-18T18-05-40.911064.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-18T18-05-40.911064.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_18T18_05_40.911064", "path": ["**/details_harness|gsm8k|5_2023-10-18T18-05-40.911064.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-18T18-05-40.911064.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T13_45_36.693423", "path": ["**/details_harness|hellaswag|10_2023-07-19T13:45:36.693423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T13:45:36.693423.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_07_19T13_45_36.693423", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:45:36.693423.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:45:36.693423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:45:36.693423.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T13:45:36.693423.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:45:36.693423.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T13_45_36.693423", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:45:36.693423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T13:45:36.693423.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T13_45_36.693423", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:45:36.693423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T13:45:36.693423.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T13_45_36.693423", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:45:36.693423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T13:45:36.693423.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T13_45_36.693423", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:45:36.693423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T13:45:36.693423.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T13_45_36.693423", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:45:36.693423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T13:45:36.693423.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T13_45_36.693423", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:45:36.693423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T13:45:36.693423.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T13_45_36.693423", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:45:36.693423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T13:45:36.693423.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T13_45_36.693423", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:45:36.693423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T13:45:36.693423.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T13_45_36.693423", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:45:36.693423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T13:45:36.693423.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T13_45_36.693423", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:45:36.693423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T13:45:36.693423.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T13_45_36.693423", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:45:36.693423.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T13:45:36.693423.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T13_45_36.693423", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:45:36.693423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T13:45:36.693423.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T13_45_36.693423", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:45:36.693423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T13:45:36.693423.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T13_45_36.693423", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:45:36.693423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T13:45:36.693423.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T13_45_36.693423", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:45:36.693423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T13:45:36.693423.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T13_45_36.693423", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:45:36.693423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T13:45:36.693423.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T13_45_36.693423", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:45:36.693423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T13:45:36.693423.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T13_45_36.693423", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:45:36.693423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T13:45:36.693423.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T13_45_36.693423", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:45:36.693423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T13:45:36.693423.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T13_45_36.693423", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:45:36.693423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T13:45:36.693423.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T13_45_36.693423", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:45:36.693423.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T13:45:36.693423.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T13_45_36.693423", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:45:36.693423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T13:45:36.693423.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T13_45_36.693423", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:45:36.693423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T13:45:36.693423.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T13_45_36.693423", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:45:36.693423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T13:45:36.693423.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T13_45_36.693423", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:45:36.693423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T13:45:36.693423.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T13_45_36.693423", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:45:36.693423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T13:45:36.693423.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T13_45_36.693423", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:45:36.693423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T13:45:36.693423.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T13_45_36.693423", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:45:36.693423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T13:45:36.693423.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T13_45_36.693423", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:45:36.693423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T13:45:36.693423.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T13_45_36.693423", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:45:36.693423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T13:45:36.693423.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T13_45_36.693423", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:45:36.693423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T13:45:36.693423.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T13_45_36.693423", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:45:36.693423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T13:45:36.693423.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T13_45_36.693423", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:45:36.693423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T13:45:36.693423.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T13_45_36.693423", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:45:36.693423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T13:45:36.693423.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T13_45_36.693423", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:45:36.693423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T13:45:36.693423.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T13_45_36.693423", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:45:36.693423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T13:45:36.693423.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T13_45_36.693423", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:45:36.693423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T13:45:36.693423.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T13_45_36.693423", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:45:36.693423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T13:45:36.693423.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T13_45_36.693423", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T13:45:36.693423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T13:45:36.693423.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T13_45_36.693423", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:45:36.693423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T13:45:36.693423.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T13_45_36.693423", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:45:36.693423.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T13:45:36.693423.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T13_45_36.693423", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:45:36.693423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T13:45:36.693423.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T13_45_36.693423", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:45:36.693423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T13:45:36.693423.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T13_45_36.693423", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:45:36.693423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T13:45:36.693423.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T13_45_36.693423", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:45:36.693423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T13:45:36.693423.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T13_45_36.693423", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:45:36.693423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T13:45:36.693423.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T13_45_36.693423", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:45:36.693423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T13:45:36.693423.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T13_45_36.693423", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:45:36.693423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T13:45:36.693423.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T13_45_36.693423", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:45:36.693423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T13:45:36.693423.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T13_45_36.693423", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:45:36.693423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T13:45:36.693423.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T13_45_36.693423", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:45:36.693423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T13:45:36.693423.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T13_45_36.693423", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:45:36.693423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T13:45:36.693423.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T13_45_36.693423", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:45:36.693423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T13:45:36.693423.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T13_45_36.693423", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:45:36.693423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T13:45:36.693423.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T13_45_36.693423", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:45:36.693423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T13:45:36.693423.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T13_45_36.693423", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T13:45:36.693423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T13:45:36.693423.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T13_45_36.693423", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:45:36.693423.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T13:45:36.693423.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T13_45_36.693423", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T13:45:36.693423.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T13:45:36.693423.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_18T18_05_40.911064", "path": ["**/details_harness|winogrande|5_2023-10-18T18-05-40.911064.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-18T18-05-40.911064.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T13_45_36.693423", "path": ["results_2023-07-19T13:45:36.693423.parquet"]}, {"split": "2023_10_18T18_05_40.911064", "path": ["results_2023-10-18T18-05-40.911064.parquet"]}, {"split": "latest", "path": ["results_2023-10-18T18-05-40.911064.parquet"]}]}]}
|
2023-10-18T17:05:53+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of MBZUAI/lamini-cerebras-111m
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model MBZUAI/lamini-cerebras-111m on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
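For instance, a minimal sketch (the repository id here is an assumption, inferred from the details_<org>__<model> naming convention used by the other evaluation datasets in this document; the "harness_winogrande_5" configuration appears in this dataset's config metadata):

```python
from datasets import load_dataset

# Load the winogrande details; per the card above, the "train" split
# always points to the latest results.
data = load_dataset("open-llm-leaderboard/details_MBZUAI__lamini-cerebras-111m",  # assumed repo id
                    "harness_winogrande_5",
                    split="train")
```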
## Latest results
These are the latest results from run 2023-10-18T18:05:40.911064 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of MBZUAI/lamini-cerebras-111m",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model MBZUAI/lamini-cerebras-111m on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-18T18:05:40.911064(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of MBZUAI/lamini-cerebras-111m",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model MBZUAI/lamini-cerebras-111m on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-18T18:05:40.911064(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
21,
31,
169,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of MBZUAI/lamini-cerebras-111m## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model MBZUAI/lamini-cerebras-111m on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-18T18:05:40.911064(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
756349107db539c0dde4e69d1813861669aa07a8
|
# Dataset Card for Evaluation run of psmathur/model_42_70b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/psmathur/model_42_70b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [psmathur/model_42_70b](https://huggingface.co/psmathur/model_42_70b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_psmathur__model_42_70b",
"harness_winogrande_5",
split="train")
```
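The aggregated metrics can be loaded the same way; a minimal sketch using the "results" configuration and the "latest" split, both of which appear in this dataset's config metadata:

```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics of every run;
# the "latest" split always points at the most recent evaluation.
results = load_dataset("open-llm-leaderboard/details_psmathur__model_42_70b",
                       "results",
                       split="latest")
```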
## Latest results
These are the [latest results from run 2023-10-22T05:41:24.012842](https://huggingface.co/datasets/open-llm-leaderboard/details_psmathur__model_42_70b/blob/main/results_2023-10-22T05-41-24.012842.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.08095637583892618,
"em_stderr": 0.0027934007378494835,
"f1": 0.14366401006711405,
"f1_stderr": 0.0029514013565745323,
"acc": 0.591927346839615,
"acc_stderr": 0.011752297176210316
},
"harness|drop|3": {
"em": 0.08095637583892618,
"em_stderr": 0.0027934007378494835,
"f1": 0.14366401006711405,
"f1_stderr": 0.0029514013565745323
},
"harness|gsm8k|5": {
"acc": 0.34723275208491283,
"acc_stderr": 0.01311389838214687
},
"harness|winogrande|5": {
"acc": 0.8366219415943172,
"acc_stderr": 0.01039069597027376
}
}
```
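Individual runs can also be pinned by their timestamped split name instead of "latest"; a small sketch for the 2023-10-22 run shown above (the split name, with dashes and colons of the timestamp replaced by underscores, is taken from this dataset's config metadata):

```python
from datasets import load_dataset

# Pin the winogrande details to the specific 2023-10-22 run rather than
# following the moving "latest" split.
run = load_dataset("open-llm-leaderboard/details_psmathur__model_42_70b",
                   "harness_winogrande_5",
                   split="2023_10_22T05_41_24.012842")
```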
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_psmathur__model_42_70b
|
[
"region:us"
] |
2023-08-17T23:15:34+00:00
|
{"pretty_name": "Evaluation run of psmathur/model_42_70b", "dataset_summary": "Dataset automatically created during the evaluation run of model [psmathur/model_42_70b](https://huggingface.co/psmathur/model_42_70b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_psmathur__model_42_70b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-22T05:41:24.012842](https://huggingface.co/datasets/open-llm-leaderboard/details_psmathur__model_42_70b/blob/main/results_2023-10-22T05-41-24.012842.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.08095637583892618,\n \"em_stderr\": 0.0027934007378494835,\n \"f1\": 0.14366401006711405,\n \"f1_stderr\": 0.0029514013565745323,\n \"acc\": 0.591927346839615,\n \"acc_stderr\": 0.011752297176210316\n },\n \"harness|drop|3\": {\n \"em\": 0.08095637583892618,\n \"em_stderr\": 0.0027934007378494835,\n \"f1\": 0.14366401006711405,\n \"f1_stderr\": 0.0029514013565745323\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.34723275208491283,\n \"acc_stderr\": 0.01311389838214687\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8366219415943172,\n \"acc_stderr\": 0.01039069597027376\n }\n}\n```", "repo_url": "https://huggingface.co/psmathur/model_42_70b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_09T19_07_45.652340", "path": ["**/details_harness|arc:challenge|25_2023-08-09T19:07:45.652340.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-09T19:07:45.652340.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_22T05_41_24.012842", "path": ["**/details_harness|drop|3_2023-10-22T05-41-24.012842.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-22T05-41-24.012842.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_22T05_41_24.012842", "path": ["**/details_harness|gsm8k|5_2023-10-22T05-41-24.012842.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-22T05-41-24.012842.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_09T19_07_45.652340", "path": ["**/details_harness|hellaswag|10_2023-08-09T19:07:45.652340.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-09T19:07:45.652340.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_08_09T19_07_45.652340", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T19:07:45.652340.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T19:07:45.652340.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T19:07:45.652340.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T19:07:45.652340.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T19:07:45.652340.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_08_09T19_07_45.652340", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T19:07:45.652340.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T19:07:45.652340.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_09T19_07_45.652340", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T19:07:45.652340.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T19:07:45.652340.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_09T19_07_45.652340", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T19:07:45.652340.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T19:07:45.652340.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_09T19_07_45.652340", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T19:07:45.652340.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T19:07:45.652340.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_09T19_07_45.652340", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T19:07:45.652340.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T19:07:45.652340.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_09T19_07_45.652340", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T19:07:45.652340.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T19:07:45.652340.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_09T19_07_45.652340", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T19:07:45.652340.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T19:07:45.652340.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_09T19_07_45.652340", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T19:07:45.652340.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T19:07:45.652340.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_09T19_07_45.652340", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T19:07:45.652340.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T19:07:45.652340.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_09T19_07_45.652340", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T19:07:45.652340.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T19:07:45.652340.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_09T19_07_45.652340", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T19:07:45.652340.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T19:07:45.652340.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_09T19_07_45.652340", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T19:07:45.652340.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T19:07:45.652340.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_09T19_07_45.652340", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T19:07:45.652340.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T19:07:45.652340.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_09T19_07_45.652340", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T19:07:45.652340.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T19:07:45.652340.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_09T19_07_45.652340", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T19:07:45.652340.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T19:07:45.652340.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_09T19_07_45.652340", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T19:07:45.652340.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T19:07:45.652340.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_09T19_07_45.652340", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T19:07:45.652340.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T19:07:45.652340.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_09T19_07_45.652340", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T19:07:45.652340.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T19:07:45.652340.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_09T19_07_45.652340", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T19:07:45.652340.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T19:07:45.652340.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_09T19_07_45.652340", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T19:07:45.652340.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T19:07:45.652340.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_09T19_07_45.652340", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T19:07:45.652340.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T19:07:45.652340.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_09T19_07_45.652340", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T19:07:45.652340.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T19:07:45.652340.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_09T19_07_45.652340", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T19:07:45.652340.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T19:07:45.652340.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_09T19_07_45.652340", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T19:07:45.652340.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T19:07:45.652340.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_09T19_07_45.652340", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T19:07:45.652340.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T19:07:45.652340.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_09T19_07_45.652340", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T19:07:45.652340.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T19:07:45.652340.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_09T19_07_45.652340", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T19:07:45.652340.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T19:07:45.652340.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_09T19_07_45.652340", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T19:07:45.652340.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T19:07:45.652340.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_09T19_07_45.652340", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T19:07:45.652340.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T19:07:45.652340.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_09T19_07_45.652340", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T19:07:45.652340.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T19:07:45.652340.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_09T19_07_45.652340", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T19:07:45.652340.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T19:07:45.652340.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_09T19_07_45.652340", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T19:07:45.652340.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T19:07:45.652340.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_09T19_07_45.652340", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T19:07:45.652340.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T19:07:45.652340.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_09T19_07_45.652340", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T19:07:45.652340.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T19:07:45.652340.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_09T19_07_45.652340", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T19:07:45.652340.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T19:07:45.652340.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_09T19_07_45.652340", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T19:07:45.652340.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T19:07:45.652340.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_09T19_07_45.652340", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T19:07:45.652340.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T19:07:45.652340.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_09T19_07_45.652340", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T19:07:45.652340.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T19:07:45.652340.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_09T19_07_45.652340", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T19:07:45.652340.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T19:07:45.652340.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_09T19_07_45.652340", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T19:07:45.652340.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T19:07:45.652340.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_09T19_07_45.652340", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T19:07:45.652340.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T19:07:45.652340.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_09T19_07_45.652340", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T19:07:45.652340.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T19:07:45.652340.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_09T19_07_45.652340", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T19:07:45.652340.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T19:07:45.652340.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_09T19_07_45.652340", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T19:07:45.652340.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T19:07:45.652340.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_09T19_07_45.652340", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T19:07:45.652340.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T19:07:45.652340.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_09T19_07_45.652340", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T19:07:45.652340.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T19:07:45.652340.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_09T19_07_45.652340", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T19:07:45.652340.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T19:07:45.652340.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_09T19_07_45.652340", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T19:07:45.652340.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T19:07:45.652340.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_09T19_07_45.652340", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T19:07:45.652340.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T19:07:45.652340.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_09T19_07_45.652340", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T19:07:45.652340.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T19:07:45.652340.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_09T19_07_45.652340", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T19:07:45.652340.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T19:07:45.652340.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_08_09T19_07_45.652340", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T19:07:45.652340.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T19:07:45.652340.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_09T19_07_45.652340", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T19:07:45.652340.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T19:07:45.652340.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_09T19_07_45.652340", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T19:07:45.652340.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T19:07:45.652340.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_09T19_07_45.652340", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T19:07:45.652340.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T19:07:45.652340.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_09T19_07_45.652340", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T19:07:45.652340.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T19:07:45.652340.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_09T19_07_45.652340", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T19:07:45.652340.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T19:07:45.652340.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_09T19_07_45.652340", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T19:07:45.652340.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T19:07:45.652340.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_22T05_41_24.012842", "path": ["**/details_harness|winogrande|5_2023-10-22T05-41-24.012842.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-22T05-41-24.012842.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_09T19_07_45.652340", "path": ["results_2023-08-09T19:07:45.652340.parquet"]}, {"split": "2023_10_22T05_41_24.012842", "path": ["results_2023-10-22T05-41-24.012842.parquet"]}, {"split": "latest", "path": ["results_2023-10-22T05-41-24.012842.parquet"]}]}]}
|
2023-10-22T04:41:36+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of psmathur/model_42_70b
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model psmathur/model_42_70b on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-10-22T05:41:24.012842 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of psmathur/model_42_70b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model psmathur/model_42_70b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-22T05:41:24.012842(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of psmathur/model_42_70b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model psmathur/model_42_70b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-22T05:41:24.012842(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
19,
31,
167,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of psmathur/model_42_70b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model psmathur/model_42_70b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-22T05:41:24.012842(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
717eed04b15cc3bae4ffaa14584c4091f25802d9
|
# Dataset Card for Evaluation run of psmathur/model_007_13b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/psmathur/model_007_13b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [psmathur/model_007_13b](https://huggingface.co/psmathur/model_007_13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_psmathur__model_007_13b",
"harness_truthfulqa_mc_0",
split="train")
```
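With 61 configurations available, it can help to enumerate them before picking one; a small sketch using the datasets library's config-inspection helper (output order and exact names depend on the current hub state):

```python
from datasets import get_dataset_config_names

# List every configuration of this dataset: one per evaluated task,
# plus the aggregated "results" configuration.
configs = get_dataset_config_names("open-llm-leaderboard/details_psmathur__model_007_13b")
print(len(configs), configs[:5])
```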
## Latest results
These are the [latest results from run 2023-08-11T11:34:56.294632](https://huggingface.co/datasets/open-llm-leaderboard/details_psmathur__model_007_13b/blob/main/results_2023-08-11T11%3A34%3A56.294632.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2314240573187148,
"acc_stderr": 0.03071122006512167,
"acc_norm": 0.2314240573187148,
"acc_norm_stderr": 0.03071122006512167,
"mc1": 1.0,
"mc1_stderr": 0.0,
"mc2": NaN,
"mc2_stderr": NaN
},
"harness|arc:challenge|25": {
"acc": 0.22696245733788395,
"acc_stderr": 0.012240491536132861,
"acc_norm": 0.22696245733788395,
"acc_norm_stderr": 0.012240491536132861
},
"harness|hellaswag|10": {
"acc": 0.2504481179047998,
"acc_stderr": 0.004323856300539177,
"acc_norm": 0.2504481179047998,
"acc_norm_stderr": 0.004323856300539177
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.03355677216313142,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.03355677216313142
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21509433962264152,
"acc_stderr": 0.02528839450289137,
"acc_norm": 0.21509433962264152,
"acc_norm_stderr": 0.02528839450289137
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749874,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749874
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20899470899470898,
"acc_stderr": 0.02094048156533486,
"acc_norm": 0.20899470899470898,
"acc_norm_stderr": 0.02094048156533486
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04040610178208841,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04040610178208841
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.1774193548387097,
"acc_stderr": 0.02173254068932927,
"acc_norm": 0.1774193548387097,
"acc_norm_stderr": 0.02173254068932927
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.15270935960591134,
"acc_stderr": 0.02530890453938063,
"acc_norm": 0.15270935960591134,
"acc_norm_stderr": 0.02530890453938063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.17676767676767677,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.17676767676767677,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860664,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860664
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.20256410256410257,
"acc_stderr": 0.020377660970371372,
"acc_norm": 0.20256410256410257,
"acc_norm_stderr": 0.020377660970371372
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2111111111111111,
"acc_stderr": 0.024882116857655075,
"acc_norm": 0.2111111111111111,
"acc_norm_stderr": 0.024882116857655075
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436776,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436776
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1926605504587156,
"acc_stderr": 0.016909276884936094,
"acc_norm": 0.1926605504587156,
"acc_norm_stderr": 0.016909276884936094
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1527777777777778,
"acc_stderr": 0.024536326026134224,
"acc_norm": 0.1527777777777778,
"acc_norm_stderr": 0.024536326026134224
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.31390134529147984,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.31390134529147984,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2905982905982906,
"acc_stderr": 0.02974504857267404,
"acc_norm": 0.2905982905982906,
"acc_norm_stderr": 0.02974504857267404
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23754789272030652,
"acc_stderr": 0.015218733046150193,
"acc_norm": 0.23754789272030652,
"acc_norm_stderr": 0.015218733046150193
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.023929155517351284,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.023929155517351284
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.02212243977248077,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.02212243977248077
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432417,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432417
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.18775510204081633,
"acc_stderr": 0.02500025603954621,
"acc_norm": 0.18775510204081633,
"acc_norm_stderr": 0.02500025603954621
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 1.0,
"mc1_stderr": 0.0,
"mc2": NaN,
"mc2_stderr": NaN
}
}
```
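To work with these numbers programmatically rather than reading them off the card, a small sketch like the following can help; it assumes the JSON block above has been saved to a local file named `results.json` (a hypothetical path):
```python
import json

# Parse the results blob shown above; Python's json module accepts the
# bare NaN values appearing in the truthfulqa entries by default.
with open("results.json") as f:
    results = json.load(f)

# Average the accuracy over the hendrycksTest (MMLU) subtasks.
mmlu_accs = [
    scores["acc"]
    for task, scores in results.items()
    if task.startswith("harness|hendrycksTest-")
]
print(f"{len(mmlu_accs)} MMLU subtasks, mean acc = {sum(mmlu_accs) / len(mmlu_accs):.4f}")
```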
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_psmathur__model_007_13b
|
[
"region:us"
] |
2023-08-17T23:15:44+00:00
|
{"pretty_name": "Evaluation run of psmathur/model_007_13b", "dataset_summary": "Dataset automatically created during the evaluation run of model [psmathur/model_007_13b](https://huggingface.co/psmathur/model_007_13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_psmathur__model_007_13b\",\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-08-11T11:34:56.294632](https://huggingface.co/datasets/open-llm-leaderboard/details_psmathur__model_007_13b/blob/main/results_2023-08-11T11%3A34%3A56.294632.json) (note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2314240573187148,\n \"acc_stderr\": 0.03071122006512167,\n \"acc_norm\": 0.2314240573187148,\n \"acc_norm_stderr\": 0.03071122006512167,\n \"mc1\": 1.0,\n \"mc1_stderr\": 0.0,\n \"mc2\": NaN,\n \"mc2_stderr\": NaN\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.22696245733788395,\n \"acc_stderr\": 0.012240491536132861,\n \"acc_norm\": 0.22696245733788395,\n \"acc_norm_stderr\": 0.012240491536132861\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2504481179047998,\n \"acc_stderr\": 0.004323856300539177,\n \"acc_norm\": 0.2504481179047998,\n \"acc_norm_stderr\": 0.004323856300539177\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.18518518518518517,\n \"acc_stderr\": 0.03355677216313142,\n \"acc_norm\": 0.18518518518518517,\n \"acc_norm_stderr\": 0.03355677216313142\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.21509433962264152,\n \"acc_stderr\": 0.02528839450289137,\n \"acc_norm\": 0.21509433962264152,\n \"acc_norm_stderr\": 0.02528839450289137\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n 
\"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n \"acc_stderr\": 0.030952890217749874,\n \"acc_norm\": 0.20809248554913296,\n \"acc_norm_stderr\": 0.030952890217749874\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533486,\n \"acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533486\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.1774193548387097,\n \"acc_stderr\": 0.02173254068932927,\n \"acc_norm\": 0.1774193548387097,\n \"acc_norm_stderr\": 0.02173254068932927\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.15270935960591134,\n \"acc_stderr\": 0.02530890453938063,\n \"acc_norm\": 0.15270935960591134,\n \"acc_norm_stderr\": 0.02530890453938063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.20256410256410257,\n \"acc_stderr\": 
0.020377660970371372,\n \"acc_norm\": 0.20256410256410257,\n \"acc_norm_stderr\": 0.020377660970371372\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655075,\n \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655075\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.1926605504587156,\n \"acc_stderr\": 0.016909276884936094,\n \"acc_norm\": 0.1926605504587156,\n \"acc_norm_stderr\": 0.016909276884936094\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134224,\n \"acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134224\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n \"acc_stderr\": 0.02974504857267404,\n \"acc_norm\": 0.2905982905982906,\n \"acc_norm_stderr\": 0.02974504857267404\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23754789272030652,\n \"acc_stderr\": 0.015218733046150193,\n \"acc_norm\": 0.23754789272030652,\n \"acc_norm_stderr\": 0.015218733046150193\n },\n 
\"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.18775510204081633,\n \"acc_stderr\": 0.02500025603954621,\n \"acc_norm\": 0.18775510204081633,\n \"acc_norm_stderr\": 0.02500025603954621\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3216374269005848,\n \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.3216374269005848,\n \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 1.0,\n \"mc1_stderr\": 0.0,\n \"mc2\": NaN,\n \"mc2_stderr\": NaN\n }\n}\n```", "repo_url": "https://huggingface.co/psmathur/model_007_13b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_09T13_37_17.110700", "path": ["**/details_harness|arc:challenge|25_2023-08-09T13:37:17.110700.parquet"]}, {"split": 
"2023_08_11T11_34_56.294632", "path": ["**/details_harness|arc:challenge|25_2023-08-11T11:34:56.294632.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-11T11:34:56.294632.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_09T13_37_17.110700", "path": ["**/details_harness|hellaswag|10_2023-08-09T13:37:17.110700.parquet"]}, {"split": "2023_08_11T11_34_56.294632", "path": ["**/details_harness|hellaswag|10_2023-08-11T11:34:56.294632.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-11T11:34:56.294632.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_09T13_37_17.110700", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T13:37:17.110700.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T13:37:17.110700.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T13:37:17.110700.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T13:37:17.110700.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T13:37:17.110700.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T13:37:17.110700.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T13:37:17.110700.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T13:37:17.110700.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T13:37:17.110700.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T13:37:17.110700.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T13:37:17.110700.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T13:37:17.110700.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T13:37:17.110700.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T13:37:17.110700.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T13:37:17.110700.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T13:37:17.110700.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T13:37:17.110700.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T13:37:17.110700.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T13:37:17.110700.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T13:37:17.110700.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T13:37:17.110700.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T13:37:17.110700.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T13:37:17.110700.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T13:37:17.110700.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T13:37:17.110700.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T13:37:17.110700.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T13:37:17.110700.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T13:37:17.110700.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T13:37:17.110700.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T13:37:17.110700.parquet", 
"**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T13:37:17.110700.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T13:37:17.110700.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T13:37:17.110700.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T13:37:17.110700.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T13:37:17.110700.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T13:37:17.110700.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T13:37:17.110700.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T13:37:17.110700.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T13:37:17.110700.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T13:37:17.110700.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T13:37:17.110700.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T13:37:17.110700.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T13:37:17.110700.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T13:37:17.110700.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T13:37:17.110700.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T13:37:17.110700.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T13:37:17.110700.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T13:37:17.110700.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T13:37:17.110700.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T13:37:17.110700.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T13:37:17.110700.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T13:37:17.110700.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T13:37:17.110700.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T13:37:17.110700.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T13:37:17.110700.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T13:37:17.110700.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T13:37:17.110700.parquet"]}, {"split": "2023_08_11T11_34_56.294632", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-11T11:34:56.294632.parquet", 
"**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-11T11:34:56.294632.parquet", 
"**/details_harness|hendrycksTest-professional_psychology|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-11T11:34:56.294632.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-11T11:34:56.294632.parquet", 
"**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-11T11:34:56.294632.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-11T11:34:56.294632.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_09T13_37_17.110700", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T13:37:17.110700.parquet"]}, {"split": "2023_08_11T11_34_56.294632", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-11T11:34:56.294632.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-11T11:34:56.294632.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_09T13_37_17.110700", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T13:37:17.110700.parquet"]}, {"split": "2023_08_11T11_34_56.294632", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-11T11:34:56.294632.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-11T11:34:56.294632.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_09T13_37_17.110700", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T13:37:17.110700.parquet"]}, {"split": "2023_08_11T11_34_56.294632", "path": 
["**/details_harness|hendrycksTest-astronomy|5_2023-08-11T11:34:56.294632.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-11T11:34:56.294632.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_09T13_37_17.110700", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T13:37:17.110700.parquet"]}, {"split": "2023_08_11T11_34_56.294632", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-11T11:34:56.294632.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-11T11:34:56.294632.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_09T13_37_17.110700", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T13:37:17.110700.parquet"]}, {"split": "2023_08_11T11_34_56.294632", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-11T11:34:56.294632.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-11T11:34:56.294632.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_09T13_37_17.110700", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T13:37:17.110700.parquet"]}, {"split": "2023_08_11T11_34_56.294632", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-11T11:34:56.294632.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-11T11:34:56.294632.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_09T13_37_17.110700", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T13:37:17.110700.parquet"]}, {"split": "2023_08_11T11_34_56.294632", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-11T11:34:56.294632.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-11T11:34:56.294632.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_09T13_37_17.110700", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T13:37:17.110700.parquet"]}, {"split": "2023_08_11T11_34_56.294632", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-11T11:34:56.294632.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-11T11:34:56.294632.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_09T13_37_17.110700", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T13:37:17.110700.parquet"]}, {"split": "2023_08_11T11_34_56.294632", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-11T11:34:56.294632.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-11T11:34:56.294632.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_09T13_37_17.110700", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T13:37:17.110700.parquet"]}, {"split": "2023_08_11T11_34_56.294632", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-11T11:34:56.294632.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_medicine|5_2023-08-11T11:34:56.294632.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_09T13_37_17.110700", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T13:37:17.110700.parquet"]}, {"split": "2023_08_11T11_34_56.294632", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-11T11:34:56.294632.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-11T11:34:56.294632.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_09T13_37_17.110700", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T13:37:17.110700.parquet"]}, {"split": "2023_08_11T11_34_56.294632", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-11T11:34:56.294632.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-11T11:34:56.294632.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_09T13_37_17.110700", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T13:37:17.110700.parquet"]}, {"split": "2023_08_11T11_34_56.294632", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-11T11:34:56.294632.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-11T11:34:56.294632.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_09T13_37_17.110700", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T13:37:17.110700.parquet"]}, {"split": "2023_08_11T11_34_56.294632", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-11T11:34:56.294632.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-11T11:34:56.294632.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_09T13_37_17.110700", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T13:37:17.110700.parquet"]}, {"split": "2023_08_11T11_34_56.294632", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-11T11:34:56.294632.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-11T11:34:56.294632.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_09T13_37_17.110700", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T13:37:17.110700.parquet"]}, {"split": "2023_08_11T11_34_56.294632", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-11T11:34:56.294632.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-11T11:34:56.294632.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_09T13_37_17.110700", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T13:37:17.110700.parquet"]}, {"split": "2023_08_11T11_34_56.294632", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-11T11:34:56.294632.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-11T11:34:56.294632.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": 
[{"split": "2023_08_09T13_37_17.110700", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T13:37:17.110700.parquet"]}, {"split": "2023_08_11T11_34_56.294632", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-11T11:34:56.294632.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-11T11:34:56.294632.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_09T13_37_17.110700", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T13:37:17.110700.parquet"]}, {"split": "2023_08_11T11_34_56.294632", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-11T11:34:56.294632.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-11T11:34:56.294632.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_09T13_37_17.110700", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T13:37:17.110700.parquet"]}, {"split": "2023_08_11T11_34_56.294632", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-11T11:34:56.294632.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-11T11:34:56.294632.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_09T13_37_17.110700", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T13:37:17.110700.parquet"]}, {"split": "2023_08_11T11_34_56.294632", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-11T11:34:56.294632.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-11T11:34:56.294632.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_09T13_37_17.110700", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T13:37:17.110700.parquet"]}, {"split": "2023_08_11T11_34_56.294632", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-11T11:34:56.294632.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-11T11:34:56.294632.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_09T13_37_17.110700", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T13:37:17.110700.parquet"]}, {"split": "2023_08_11T11_34_56.294632", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-11T11:34:56.294632.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-11T11:34:56.294632.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_09T13_37_17.110700", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T13:37:17.110700.parquet"]}, {"split": "2023_08_11T11_34_56.294632", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-11T11:34:56.294632.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-11T11:34:56.294632.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_09T13_37_17.110700", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T13:37:17.110700.parquet"]}, {"split": "2023_08_11T11_34_56.294632", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-11T11:34:56.294632.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-11T11:34:56.294632.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_09T13_37_17.110700", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T13:37:17.110700.parquet"]}, {"split": "2023_08_11T11_34_56.294632", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-11T11:34:56.294632.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-11T11:34:56.294632.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_09T13_37_17.110700", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T13:37:17.110700.parquet"]}, {"split": "2023_08_11T11_34_56.294632", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-11T11:34:56.294632.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-11T11:34:56.294632.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_09T13_37_17.110700", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T13:37:17.110700.parquet"]}, {"split": "2023_08_11T11_34_56.294632", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-11T11:34:56.294632.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-11T11:34:56.294632.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_09T13_37_17.110700", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T13:37:17.110700.parquet"]}, {"split": "2023_08_11T11_34_56.294632", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-11T11:34:56.294632.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-11T11:34:56.294632.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_09T13_37_17.110700", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T13:37:17.110700.parquet"]}, {"split": "2023_08_11T11_34_56.294632", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-11T11:34:56.294632.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-11T11:34:56.294632.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_09T13_37_17.110700", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T13:37:17.110700.parquet"]}, {"split": "2023_08_11T11_34_56.294632", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-11T11:34:56.294632.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-11T11:34:56.294632.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_09T13_37_17.110700", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T13:37:17.110700.parquet"]}, {"split": "2023_08_11T11_34_56.294632", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-11T11:34:56.294632.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-11T11:34:56.294632.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_09T13_37_17.110700", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T13:37:17.110700.parquet"]}, {"split": "2023_08_11T11_34_56.294632", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-11T11:34:56.294632.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-11T11:34:56.294632.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_09T13_37_17.110700", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T13:37:17.110700.parquet"]}, {"split": "2023_08_11T11_34_56.294632", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-11T11:34:56.294632.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-11T11:34:56.294632.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_09T13_37_17.110700", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T13:37:17.110700.parquet"]}, {"split": "2023_08_11T11_34_56.294632", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-11T11:34:56.294632.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-11T11:34:56.294632.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_09T13_37_17.110700", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T13:37:17.110700.parquet"]}, {"split": "2023_08_11T11_34_56.294632", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-11T11:34:56.294632.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-11T11:34:56.294632.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_09T13_37_17.110700", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T13:37:17.110700.parquet"]}, {"split": "2023_08_11T11_34_56.294632", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-11T11:34:56.294632.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-11T11:34:56.294632.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_09T13_37_17.110700", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T13:37:17.110700.parquet"]}, {"split": "2023_08_11T11_34_56.294632", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-11T11:34:56.294632.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-11T11:34:56.294632.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_09T13_37_17.110700", "path": 
["**/details_harness|hendrycksTest-management|5_2023-08-09T13:37:17.110700.parquet"]}, {"split": "2023_08_11T11_34_56.294632", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-11T11:34:56.294632.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-11T11:34:56.294632.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_09T13_37_17.110700", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T13:37:17.110700.parquet"]}, {"split": "2023_08_11T11_34_56.294632", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-11T11:34:56.294632.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-11T11:34:56.294632.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_09T13_37_17.110700", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T13:37:17.110700.parquet"]}, {"split": "2023_08_11T11_34_56.294632", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-11T11:34:56.294632.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-11T11:34:56.294632.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_09T13_37_17.110700", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T13:37:17.110700.parquet"]}, {"split": "2023_08_11T11_34_56.294632", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-11T11:34:56.294632.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-11T11:34:56.294632.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_09T13_37_17.110700", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T13:37:17.110700.parquet"]}, {"split": "2023_08_11T11_34_56.294632", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-11T11:34:56.294632.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-11T11:34:56.294632.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_09T13_37_17.110700", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T13:37:17.110700.parquet"]}, {"split": "2023_08_11T11_34_56.294632", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-11T11:34:56.294632.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-11T11:34:56.294632.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_09T13_37_17.110700", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T13:37:17.110700.parquet"]}, {"split": "2023_08_11T11_34_56.294632", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-11T11:34:56.294632.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-11T11:34:56.294632.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_09T13_37_17.110700", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T13:37:17.110700.parquet"]}, {"split": "2023_08_11T11_34_56.294632", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-11T11:34:56.294632.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-philosophy|5_2023-08-11T11:34:56.294632.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_09T13_37_17.110700", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T13:37:17.110700.parquet"]}, {"split": "2023_08_11T11_34_56.294632", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-11T11:34:56.294632.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-11T11:34:56.294632.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_09T13_37_17.110700", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T13:37:17.110700.parquet"]}, {"split": "2023_08_11T11_34_56.294632", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-11T11:34:56.294632.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-11T11:34:56.294632.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_09T13_37_17.110700", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T13:37:17.110700.parquet"]}, {"split": "2023_08_11T11_34_56.294632", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-11T11:34:56.294632.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-11T11:34:56.294632.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_09T13_37_17.110700", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T13:37:17.110700.parquet"]}, {"split": "2023_08_11T11_34_56.294632", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-11T11:34:56.294632.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-11T11:34:56.294632.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_09T13_37_17.110700", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T13:37:17.110700.parquet"]}, {"split": "2023_08_11T11_34_56.294632", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-11T11:34:56.294632.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-11T11:34:56.294632.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_09T13_37_17.110700", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T13:37:17.110700.parquet"]}, {"split": "2023_08_11T11_34_56.294632", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-11T11:34:56.294632.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-11T11:34:56.294632.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_09T13_37_17.110700", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T13:37:17.110700.parquet"]}, {"split": "2023_08_11T11_34_56.294632", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-11T11:34:56.294632.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-11T11:34:56.294632.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", 
"data_files": [{"split": "2023_08_09T13_37_17.110700", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T13:37:17.110700.parquet"]}, {"split": "2023_08_11T11_34_56.294632", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-11T11:34:56.294632.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-11T11:34:56.294632.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_09T13_37_17.110700", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T13:37:17.110700.parquet"]}, {"split": "2023_08_11T11_34_56.294632", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-11T11:34:56.294632.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-11T11:34:56.294632.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_09T13_37_17.110700", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T13:37:17.110700.parquet"]}, {"split": "2023_08_11T11_34_56.294632", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-11T11:34:56.294632.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-11T11:34:56.294632.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_09T13_37_17.110700", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T13:37:17.110700.parquet"]}, {"split": "2023_08_11T11_34_56.294632", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-11T11:34:56.294632.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-11T11:34:56.294632.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_09T13_37_17.110700", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T13:37:17.110700.parquet"]}, {"split": "2023_08_11T11_34_56.294632", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-11T11:34:56.294632.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-11T11:34:56.294632.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_09T13_37_17.110700", "path": ["results_2023-08-09T13:37:17.110700.parquet"]}, {"split": "2023_08_11T11_34_56.294632", "path": ["results_2023-08-11T11:34:56.294632.parquet"]}, {"split": "latest", "path": ["results_2023-08-11T11:34:56.294632.parquet"]}]}]}
|
2023-08-27T11:28:41+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of psmathur/model_007_13b
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model psmathur/model_007_13b on the Open LLM Leaderboard.
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
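(A minimal sketch; the repository id below is an assumption, inferred from the leaderboard's standard `details_<org>__<model>` naming convention, while the configuration and split names come from this dataset's own file listing.)

```python
from datasets import load_dataset

# Repository id inferred from the Open LLM Leaderboard naming convention
# (an assumption); any timestamped run split or "latest" can be requested.
data = load_dataset("open-llm-leaderboard/details_psmathur__model_007_13b",
	"harness_truthfulqa_mc_0",
	split="latest")
```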
## Latest results
These are the latest results from run 2023-08-11T11:34:56.294632 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
8eeeef51152886277744d5299c4553b0002e852b
|
# Dataset Card for Evaluation run of psmathur/model_420
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/psmathur/model_420
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [psmathur/model_420](https://huggingface.co/psmathur/model_420) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_psmathur__model_420",
"harness_winogrande_5",
split="train")
```
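The aggregated metrics can be pulled the same way from the "results" configuration (a sketch; the config and "latest" split names are taken from this dataset's file listing):

```python
from datasets import load_dataset

# "results" stores the aggregated metrics for every run; the "latest"
# split always resolves to the most recent results parquet.
results = load_dataset("open-llm-leaderboard/details_psmathur__model_420",
	"results",
	split="latest")
```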
## Latest results
These are the [latest results from run 2023-10-25T12:29:32.127683](https://huggingface.co/datasets/open-llm-leaderboard/details_psmathur__model_420/blob/main/results_2023-10-25T12-29-32.127683.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.07770553691275167,
"em_stderr": 0.002741576916689869,
"f1": 0.1435245385906032,
"f1_stderr": 0.0028999685202973128,
"acc": 0.5616169002251712,
"acc_stderr": 0.01140770950597949
},
"harness|drop|3": {
"em": 0.07770553691275167,
"em_stderr": 0.002741576916689869,
"f1": 0.1435245385906032,
"f1_stderr": 0.0028999685202973128
},
"harness|gsm8k|5": {
"acc": 0.28582259287338896,
"acc_stderr": 0.01244496346061563
},
"harness|winogrande|5": {
"acc": 0.8374112075769534,
"acc_stderr": 0.010370455551343348
}
}
```
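Since each run is stored as a timestamped split, the available runs can be enumerated before loading anything (a sketch using `datasets.get_dataset_split_names`; the split names in the comment come from this dataset's configuration):

```python
from datasets import get_dataset_split_names

# Each run appears as a timestamped split alongside "latest", e.g.
# 2023_08_09T21_30_53.861982, 2023_10_25T12_29_32.127683, latest.
for split in get_dataset_split_names(
        "open-llm-leaderboard/details_psmathur__model_420", "results"):
    print(split)
```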
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_psmathur__model_420
|
[
"region:us"
] |
2023-08-17T23:15:55+00:00
|
{"pretty_name": "Evaluation run of psmathur/model_420", "dataset_summary": "Dataset automatically created during the evaluation run of model [psmathur/model_420](https://huggingface.co/psmathur/model_420) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_psmathur__model_420\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-25T12:29:32.127683](https://huggingface.co/datasets/open-llm-leaderboard/details_psmathur__model_420/blob/main/results_2023-10-25T12-29-32.127683.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.07770553691275167,\n \"em_stderr\": 0.002741576916689869,\n \"f1\": 0.1435245385906032,\n \"f1_stderr\": 0.0028999685202973128,\n \"acc\": 0.5616169002251712,\n \"acc_stderr\": 0.01140770950597949\n },\n \"harness|drop|3\": {\n \"em\": 0.07770553691275167,\n \"em_stderr\": 0.002741576916689869,\n \"f1\": 0.1435245385906032,\n \"f1_stderr\": 0.0028999685202973128\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.28582259287338896,\n \"acc_stderr\": 0.01244496346061563\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8374112075769534,\n \"acc_stderr\": 0.010370455551343348\n }\n}\n```", "repo_url": "https://huggingface.co/psmathur/model_420", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_09T21_30_53.861982", "path": ["**/details_harness|arc:challenge|25_2023-08-09T21:30:53.861982.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-09T21:30:53.861982.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_25T12_29_32.127683", "path": ["**/details_harness|drop|3_2023-10-25T12-29-32.127683.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-25T12-29-32.127683.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_25T12_29_32.127683", "path": ["**/details_harness|gsm8k|5_2023-10-25T12-29-32.127683.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-25T12-29-32.127683.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_09T21_30_53.861982", "path": ["**/details_harness|hellaswag|10_2023-08-09T21:30:53.861982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-09T21:30:53.861982.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_09T21_30_53.861982", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T21:30:53.861982.parquet", 
"**/details_harness|hendrycksTest-management|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T21:30:53.861982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T21:30:53.861982.parquet", 
"**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T21:30:53.861982.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T21:30:53.861982.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_09T21_30_53.861982", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T21:30:53.861982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T21:30:53.861982.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_09T21_30_53.861982", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T21:30:53.861982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T21:30:53.861982.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_09T21_30_53.861982", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T21:30:53.861982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T21:30:53.861982.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_09T21_30_53.861982", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T21:30:53.861982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T21:30:53.861982.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_09T21_30_53.861982", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T21:30:53.861982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T21:30:53.861982.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_09T21_30_53.861982", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T21:30:53.861982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T21:30:53.861982.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_09T21_30_53.861982", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T21:30:53.861982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T21:30:53.861982.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_09T21_30_53.861982", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T21:30:53.861982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T21:30:53.861982.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_09T21_30_53.861982", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T21:30:53.861982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T21:30:53.861982.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_09T21_30_53.861982", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T21:30:53.861982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T21:30:53.861982.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_09T21_30_53.861982", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T21:30:53.861982.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T21:30:53.861982.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_09T21_30_53.861982", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T21:30:53.861982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T21:30:53.861982.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_09T21_30_53.861982", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T21:30:53.861982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T21:30:53.861982.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_09T21_30_53.861982", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T21:30:53.861982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T21:30:53.861982.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_09T21_30_53.861982", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T21:30:53.861982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T21:30:53.861982.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_09T21_30_53.861982", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T21:30:53.861982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T21:30:53.861982.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_09T21_30_53.861982", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T21:30:53.861982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T21:30:53.861982.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_09T21_30_53.861982", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T21:30:53.861982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T21:30:53.861982.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_09T21_30_53.861982", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T21:30:53.861982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T21:30:53.861982.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_09T21_30_53.861982", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T21:30:53.861982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T21:30:53.861982.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_09T21_30_53.861982", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T21:30:53.861982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T21:30:53.861982.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_09T21_30_53.861982", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T21:30:53.861982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T21:30:53.861982.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_09T21_30_53.861982", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T21:30:53.861982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T21:30:53.861982.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_09T21_30_53.861982", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T21:30:53.861982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T21:30:53.861982.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_09T21_30_53.861982", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T21:30:53.861982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T21:30:53.861982.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_09T21_30_53.861982", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T21:30:53.861982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T21:30:53.861982.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_09T21_30_53.861982", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T21:30:53.861982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T21:30:53.861982.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_09T21_30_53.861982", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T21:30:53.861982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T21:30:53.861982.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_09T21_30_53.861982", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T21:30:53.861982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T21:30:53.861982.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_09T21_30_53.861982", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T21:30:53.861982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T21:30:53.861982.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_09T21_30_53.861982", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T21:30:53.861982.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T21:30:53.861982.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_09T21_30_53.861982", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T21:30:53.861982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T21:30:53.861982.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_09T21_30_53.861982", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T21:30:53.861982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T21:30:53.861982.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_09T21_30_53.861982", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T21:30:53.861982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T21:30:53.861982.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_09T21_30_53.861982", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T21:30:53.861982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T21:30:53.861982.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_09T21_30_53.861982", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T21:30:53.861982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T21:30:53.861982.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_09T21_30_53.861982", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T21:30:53.861982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T21:30:53.861982.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_09T21_30_53.861982", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T21:30:53.861982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T21:30:53.861982.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_09T21_30_53.861982", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T21:30:53.861982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T21:30:53.861982.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_09T21_30_53.861982", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T21:30:53.861982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T21:30:53.861982.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_09T21_30_53.861982", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T21:30:53.861982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T21:30:53.861982.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": 
"2023_08_09T21_30_53.861982", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T21:30:53.861982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T21:30:53.861982.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_09T21_30_53.861982", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T21:30:53.861982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T21:30:53.861982.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_09T21_30_53.861982", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T21:30:53.861982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T21:30:53.861982.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_09T21_30_53.861982", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T21:30:53.861982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T21:30:53.861982.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_09T21_30_53.861982", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T21:30:53.861982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T21:30:53.861982.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_09T21_30_53.861982", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T21:30:53.861982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T21:30:53.861982.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_09T21_30_53.861982", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T21:30:53.861982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T21:30:53.861982.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_09T21_30_53.861982", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T21:30:53.861982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T21:30:53.861982.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_09T21_30_53.861982", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T21:30:53.861982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T21:30:53.861982.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_09T21_30_53.861982", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T21:30:53.861982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T21:30:53.861982.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_09T21_30_53.861982", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T21:30:53.861982.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T21:30:53.861982.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_09T21_30_53.861982", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T21:30:53.861982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T21:30:53.861982.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_09T21_30_53.861982", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T21:30:53.861982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T21:30:53.861982.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_09T21_30_53.861982", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T21:30:53.861982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T21:30:53.861982.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_09T21_30_53.861982", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T21:30:53.861982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T21:30:53.861982.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_09T21_30_53.861982", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T21:30:53.861982.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T21:30:53.861982.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_09T21_30_53.861982", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T21:30:53.861982.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T21:30:53.861982.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_25T12_29_32.127683", "path": ["**/details_harness|winogrande|5_2023-10-25T12-29-32.127683.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-25T12-29-32.127683.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_09T21_30_53.861982", "path": ["results_2023-08-09T21:30:53.861982.parquet"]}, {"split": "2023_10_25T12_29_32.127683", "path": ["results_2023-10-25T12-29-32.127683.parquet"]}, {"split": "latest", "path": ["results_2023-10-25T12-29-32.127683.parquet"]}]}]}
|
2023-10-25T11:29:44+00:00
|
[] |
[] |
TAGS
#region-us
|
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of psmathur/model_420",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model psmathur/model_420 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-25T12:29:32.127683(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
16,
31,
164,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of psmathur/model_420## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model psmathur/model_420 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-25T12:29:32.127683(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
7000b83aa05d5a43c277a073762f79d1ffe2689b
|
# Dataset Card for Evaluation run of psmathur/orca_mini_13b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/psmathur/orca_mini_13b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [psmathur/orca_mini_13b](https://huggingface.co/psmathur/orca_mini_13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_psmathur__orca_mini_13b",
"harness_winogrande_5",
split="train")
```
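Because each run is stored as its own split, you can also pin a specific evaluation by timestamp instead of taking the latest one; a small sketch using a split name taken from this repo's configs:

```python
from datasets import load_dataset

# Load the 2023-10-15 winogrande run explicitly; the split name is the
# run timestamp listed under "harness_winogrande_5" in the dataset configs.
run = load_dataset("open-llm-leaderboard/details_psmathur__orca_mini_13b",
    "harness_winogrande_5",
    split="2023_10_15T13_38_44.745207")
```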
## Latest results
These are the [latest results from run 2023-10-15T13:38:44.745207](https://huggingface.co/datasets/open-llm-leaderboard/details_psmathur__orca_mini_13b/blob/main/results_2023-10-15T13-38-44.745207.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.03355704697986577,
"em_stderr": 0.001844249316229893,
"f1": 0.11233116610738275,
"f1_stderr": 0.002439557952450172,
"acc": 0.3208366219415943,
"acc_stderr": 0.006738290586283765
},
"harness|drop|3": {
"em": 0.03355704697986577,
"em_stderr": 0.001844249316229893,
"f1": 0.11233116610738275,
"f1_stderr": 0.002439557952450172
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.6416732438831886,
"acc_stderr": 0.01347658117256753
}
}
```
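If you only need these aggregated numbers, one option is to fetch the raw results file directly with `huggingface_hub` rather than loading the full details (the filename comes from the link above; the exact JSON layout may differ from the excerpt shown here):

```python
import json
from huggingface_hub import hf_hub_download

# Download the aggregated results file linked above and inspect it.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_psmathur__orca_mini_13b",
    filename="results_2023-10-15T13-38-44.745207.json",
    repo_type="dataset",
)
with open(path) as f:
    results = json.load(f)
print(list(results))  # top-level keys; the card excerpts the "all" block
```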
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_psmathur__orca_mini_13b
|
[
"region:us"
] |
2023-08-17T23:16:05+00:00
|
{"pretty_name": "Evaluation run of psmathur/orca_mini_13b", "dataset_summary": "Dataset automatically created during the evaluation run of model [psmathur/orca_mini_13b](https://huggingface.co/psmathur/orca_mini_13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_psmathur__orca_mini_13b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-15T13:38:44.745207](https://huggingface.co/datasets/open-llm-leaderboard/details_psmathur__orca_mini_13b/blob/main/results_2023-10-15T13-38-44.745207.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.03355704697986577,\n \"em_stderr\": 0.001844249316229893,\n \"f1\": 0.11233116610738275,\n \"f1_stderr\": 0.002439557952450172,\n \"acc\": 0.3208366219415943,\n \"acc_stderr\": 0.006738290586283765\n },\n \"harness|drop|3\": {\n \"em\": 0.03355704697986577,\n \"em_stderr\": 0.001844249316229893,\n \"f1\": 0.11233116610738275,\n \"f1_stderr\": 0.002439557952450172\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6416732438831886,\n \"acc_stderr\": 0.01347658117256753\n }\n}\n```", "repo_url": "https://huggingface.co/psmathur/orca_mini_13b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_09T09_53_33.020588", "path": ["**/details_harness|arc:challenge|25_2023-08-09T09:53:33.020588.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-09T09:53:33.020588.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_15T13_38_44.745207", "path": ["**/details_harness|drop|3_2023-10-15T13-38-44.745207.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-15T13-38-44.745207.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_15T13_38_44.745207", "path": ["**/details_harness|gsm8k|5_2023-10-15T13-38-44.745207.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-15T13-38-44.745207.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_09T09_53_33.020588", "path": ["**/details_harness|hellaswag|10_2023-08-09T09:53:33.020588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-09T09:53:33.020588.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_09T09_53_33.020588", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T09:53:33.020588.parquet", 
"**/details_harness|hendrycksTest-management|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T09:53:33.020588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T09:53:33.020588.parquet", 
"**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T09:53:33.020588.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T09:53:33.020588.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_09T09_53_33.020588", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T09:53:33.020588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T09:53:33.020588.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_09T09_53_33.020588", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T09:53:33.020588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T09:53:33.020588.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_09T09_53_33.020588", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T09:53:33.020588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T09:53:33.020588.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_09T09_53_33.020588", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T09:53:33.020588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T09:53:33.020588.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_09T09_53_33.020588", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T09:53:33.020588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T09:53:33.020588.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_09T09_53_33.020588", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T09:53:33.020588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T09:53:33.020588.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_09T09_53_33.020588", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T09:53:33.020588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T09:53:33.020588.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_09T09_53_33.020588", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T09:53:33.020588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T09:53:33.020588.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_09T09_53_33.020588", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T09:53:33.020588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T09:53:33.020588.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_09T09_53_33.020588", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T09:53:33.020588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T09:53:33.020588.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_09T09_53_33.020588", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T09:53:33.020588.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T09:53:33.020588.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_09T09_53_33.020588", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T09:53:33.020588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T09:53:33.020588.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_09T09_53_33.020588", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T09:53:33.020588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T09:53:33.020588.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_09T09_53_33.020588", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T09:53:33.020588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T09:53:33.020588.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_09T09_53_33.020588", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T09:53:33.020588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T09:53:33.020588.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_09T09_53_33.020588", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T09:53:33.020588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T09:53:33.020588.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_09T09_53_33.020588", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T09:53:33.020588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T09:53:33.020588.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_09T09_53_33.020588", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T09:53:33.020588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T09:53:33.020588.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_09T09_53_33.020588", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T09:53:33.020588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T09:53:33.020588.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_09T09_53_33.020588", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T09:53:33.020588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T09:53:33.020588.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_09T09_53_33.020588", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T09:53:33.020588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T09:53:33.020588.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_09T09_53_33.020588", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T09:53:33.020588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T09:53:33.020588.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_09T09_53_33.020588", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T09:53:33.020588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T09:53:33.020588.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_09T09_53_33.020588", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T09:53:33.020588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T09:53:33.020588.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_09T09_53_33.020588", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T09:53:33.020588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T09:53:33.020588.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_09T09_53_33.020588", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T09:53:33.020588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T09:53:33.020588.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_09T09_53_33.020588", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T09:53:33.020588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T09:53:33.020588.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_09T09_53_33.020588", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T09:53:33.020588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T09:53:33.020588.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_09T09_53_33.020588", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T09:53:33.020588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T09:53:33.020588.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_09T09_53_33.020588", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T09:53:33.020588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T09:53:33.020588.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_09T09_53_33.020588", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T09:53:33.020588.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T09:53:33.020588.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_09T09_53_33.020588", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T09:53:33.020588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T09:53:33.020588.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_09T09_53_33.020588", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T09:53:33.020588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T09:53:33.020588.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_09T09_53_33.020588", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T09:53:33.020588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T09:53:33.020588.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_09T09_53_33.020588", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T09:53:33.020588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T09:53:33.020588.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_09T09_53_33.020588", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T09:53:33.020588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T09:53:33.020588.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_09T09_53_33.020588", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T09:53:33.020588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T09:53:33.020588.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_09T09_53_33.020588", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T09:53:33.020588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T09:53:33.020588.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_09T09_53_33.020588", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T09:53:33.020588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T09:53:33.020588.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_09T09_53_33.020588", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T09:53:33.020588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T09:53:33.020588.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_09T09_53_33.020588", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T09:53:33.020588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T09:53:33.020588.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": 
"2023_08_09T09_53_33.020588", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T09:53:33.020588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T09:53:33.020588.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_09T09_53_33.020588", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T09:53:33.020588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T09:53:33.020588.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_09T09_53_33.020588", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T09:53:33.020588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T09:53:33.020588.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_09T09_53_33.020588", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T09:53:33.020588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T09:53:33.020588.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_09T09_53_33.020588", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T09:53:33.020588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T09:53:33.020588.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_09T09_53_33.020588", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T09:53:33.020588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T09:53:33.020588.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_09T09_53_33.020588", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T09:53:33.020588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T09:53:33.020588.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_09T09_53_33.020588", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T09:53:33.020588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T09:53:33.020588.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_09T09_53_33.020588", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T09:53:33.020588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T09:53:33.020588.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_09T09_53_33.020588", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T09:53:33.020588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T09:53:33.020588.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_09T09_53_33.020588", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T09:53:33.020588.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T09:53:33.020588.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_09T09_53_33.020588", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T09:53:33.020588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T09:53:33.020588.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_09T09_53_33.020588", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T09:53:33.020588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T09:53:33.020588.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_09T09_53_33.020588", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T09:53:33.020588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T09:53:33.020588.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_09T09_53_33.020588", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T09:53:33.020588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T09:53:33.020588.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_09T09_53_33.020588", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T09:53:33.020588.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T09:53:33.020588.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_09T09_53_33.020588", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T09:53:33.020588.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T09:53:33.020588.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_15T13_38_44.745207", "path": ["**/details_harness|winogrande|5_2023-10-15T13-38-44.745207.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-15T13-38-44.745207.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_09T09_53_33.020588", "path": ["results_2023-08-09T09:53:33.020588.parquet"]}, {"split": "2023_10_15T13_38_44.745207", "path": ["results_2023-10-15T13-38-44.745207.parquet"]}, {"split": "latest", "path": ["results_2023-10-15T13-38-44.745207.parquet"]}]}]}
|
2023-10-15T12:38:56+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of psmathur/orca_mini_13b
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model psmathur/orca_mini_13b on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
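For reference, the load call shown in the full card for this repo is:

```python
from datasets import load_dataset

data = load_dataset("open-llm-leaderboard/details_psmathur__orca_mini_13b",
    "harness_winogrande_5",
    split="train")
```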
## Latest results
These are the latest results from run 2023-10-15T13:38:44.745207 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of psmathur/orca_mini_13b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model psmathur/orca_mini_13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-15T13:38:44.745207(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of psmathur/orca_mini_13b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model psmathur/orca_mini_13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-15T13:38:44.745207(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
20,
31,
168,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of psmathur/orca_mini_13b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model psmathur/orca_mini_13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-15T13:38:44.745207(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
3387fa9fb1456ee8c62d3f6a265c0d62791ae84d
|
# Dataset Card for Evaluation run of psmathur/orca_mini_7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/psmathur/orca_mini_7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [psmathur/orca_mini_7b](https://huggingface.co/psmathur/orca_mini_7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_psmathur__orca_mini_7b",
"harness_winogrande_5",
split="train")
```
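The aggregated metrics are exposed the same way through the "results" configuration mentioned above; a minimal sketch (the `latest` split name follows the pattern used by the other configs in this repo):

```python
from datasets import load_dataset

# "results" aggregates all metrics; "latest" points at the newest run.
results = load_dataset("open-llm-leaderboard/details_psmathur__orca_mini_7b",
    "results",
    split="latest")
```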
## Latest results
These are the [latest results from run 2023-10-22T06:33:24.999563](https://huggingface.co/datasets/open-llm-leaderboard/details_psmathur__orca_mini_7b/blob/main/results_2023-10-22T06-33-24.999563.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.06910654362416108,
"em_stderr": 0.0025974621402952,
"f1": 0.14139786073825483,
"f1_stderr": 0.0029773237554709766,
"acc": 0.3322031890175344,
"acc_stderr": 0.007500207834545966
},
"harness|drop|3": {
"em": 0.06910654362416108,
"em_stderr": 0.0025974621402952,
"f1": 0.14139786073825483,
"f1_stderr": 0.0029773237554709766
},
"harness|gsm8k|5": {
"acc": 0.0037907505686125853,
"acc_stderr": 0.0016927007401501828
},
"harness|winogrande|5": {
"acc": 0.6606156274664562,
"acc_stderr": 0.01330771492894175
}
}
```
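Note that the top-level `acc` above is simply the unweighted mean of the two accuracy tasks: (0.0037907505686125853 + 0.6606156274664562) / 2 = 0.3322031890175344.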
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_psmathur__orca_mini_7b
|
[
"region:us"
] |
2023-08-17T23:16:14+00:00
|
{"pretty_name": "Evaluation run of psmathur/orca_mini_7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [psmathur/orca_mini_7b](https://huggingface.co/psmathur/orca_mini_7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_psmathur__orca_mini_7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-22T06:33:24.999563](https://huggingface.co/datasets/open-llm-leaderboard/details_psmathur__orca_mini_7b/blob/main/results_2023-10-22T06-33-24.999563.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.06910654362416108,\n \"em_stderr\": 0.0025974621402952,\n \"f1\": 0.14139786073825483,\n \"f1_stderr\": 0.0029773237554709766,\n \"acc\": 0.3322031890175344,\n \"acc_stderr\": 0.007500207834545966\n },\n \"harness|drop|3\": {\n \"em\": 0.06910654362416108,\n \"em_stderr\": 0.0025974621402952,\n \"f1\": 0.14139786073825483,\n \"f1_stderr\": 0.0029773237554709766\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0037907505686125853,\n \"acc_stderr\": 0.0016927007401501828\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6606156274664562,\n \"acc_stderr\": 0.01330771492894175\n }\n}\n```", "repo_url": "https://huggingface.co/psmathur/orca_mini_7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T16_32_16.099234", "path": ["**/details_harness|arc:challenge|25_2023-07-19T16:32:16.099234.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T16:32:16.099234.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_22T06_33_24.999563", "path": ["**/details_harness|drop|3_2023-10-22T06-33-24.999563.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-22T06-33-24.999563.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_22T06_33_24.999563", "path": ["**/details_harness|gsm8k|5_2023-10-22T06-33-24.999563.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-22T06-33-24.999563.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T16_32_16.099234", "path": ["**/details_harness|hellaswag|10_2023-07-19T16:32:16.099234.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T16:32:16.099234.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_07_19T16_32_16.099234", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:32:16.099234.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:32:16.099234.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:32:16.099234.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T16:32:16.099234.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:32:16.099234.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T16_32_16.099234", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:32:16.099234.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:32:16.099234.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T16_32_16.099234", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:32:16.099234.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:32:16.099234.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T16_32_16.099234", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:32:16.099234.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:32:16.099234.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T16_32_16.099234", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:32:16.099234.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:32:16.099234.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T16_32_16.099234", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:32:16.099234.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:32:16.099234.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T16_32_16.099234", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:32:16.099234.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:32:16.099234.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T16_32_16.099234", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:32:16.099234.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:32:16.099234.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T16_32_16.099234", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:32:16.099234.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:32:16.099234.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T16_32_16.099234", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:32:16.099234.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:32:16.099234.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T16_32_16.099234", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:32:16.099234.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:32:16.099234.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T16_32_16.099234", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:32:16.099234.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:32:16.099234.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T16_32_16.099234", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:32:16.099234.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:32:16.099234.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T16_32_16.099234", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:32:16.099234.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:32:16.099234.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T16_32_16.099234", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:32:16.099234.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:32:16.099234.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T16_32_16.099234", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:32:16.099234.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:32:16.099234.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T16_32_16.099234", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:32:16.099234.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:32:16.099234.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T16_32_16.099234", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:32:16.099234.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:32:16.099234.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T16_32_16.099234", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:32:16.099234.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:32:16.099234.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T16_32_16.099234", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:32:16.099234.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:32:16.099234.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T16_32_16.099234", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:32:16.099234.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:32:16.099234.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T16_32_16.099234", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:32:16.099234.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:32:16.099234.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T16_32_16.099234", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:32:16.099234.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:32:16.099234.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T16_32_16.099234", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:32:16.099234.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:32:16.099234.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T16_32_16.099234", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:32:16.099234.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:32:16.099234.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T16_32_16.099234", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:32:16.099234.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:32:16.099234.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T16_32_16.099234", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:32:16.099234.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:32:16.099234.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T16_32_16.099234", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:32:16.099234.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:32:16.099234.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T16_32_16.099234", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:32:16.099234.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:32:16.099234.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T16_32_16.099234", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:32:16.099234.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:32:16.099234.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T16_32_16.099234", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:32:16.099234.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:32:16.099234.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T16_32_16.099234", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:32:16.099234.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:32:16.099234.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T16_32_16.099234", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:32:16.099234.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:32:16.099234.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T16_32_16.099234", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:32:16.099234.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:32:16.099234.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T16_32_16.099234", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:32:16.099234.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:32:16.099234.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T16_32_16.099234", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:32:16.099234.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:32:16.099234.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T16_32_16.099234", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:32:16.099234.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:32:16.099234.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T16_32_16.099234", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:32:16.099234.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:32:16.099234.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T16_32_16.099234", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:32:16.099234.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:32:16.099234.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T16_32_16.099234", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T16:32:16.099234.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T16:32:16.099234.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T16_32_16.099234", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:32:16.099234.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:32:16.099234.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T16_32_16.099234", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:32:16.099234.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:32:16.099234.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T16_32_16.099234", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:32:16.099234.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:32:16.099234.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T16_32_16.099234", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:32:16.099234.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:32:16.099234.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T16_32_16.099234", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:32:16.099234.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:32:16.099234.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T16_32_16.099234", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:32:16.099234.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:32:16.099234.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T16_32_16.099234", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:32:16.099234.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:32:16.099234.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T16_32_16.099234", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:32:16.099234.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:32:16.099234.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T16_32_16.099234", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:32:16.099234.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:32:16.099234.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T16_32_16.099234", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:32:16.099234.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:32:16.099234.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T16_32_16.099234", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:32:16.099234.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:32:16.099234.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T16_32_16.099234", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:32:16.099234.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:32:16.099234.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T16_32_16.099234", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:32:16.099234.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:32:16.099234.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T16_32_16.099234", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:32:16.099234.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:32:16.099234.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T16_32_16.099234", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:32:16.099234.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:32:16.099234.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T16_32_16.099234", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:32:16.099234.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:32:16.099234.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T16_32_16.099234", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T16:32:16.099234.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T16:32:16.099234.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T16_32_16.099234", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:32:16.099234.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:32:16.099234.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T16_32_16.099234", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T16:32:16.099234.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T16:32:16.099234.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_22T06_33_24.999563", "path": ["**/details_harness|winogrande|5_2023-10-22T06-33-24.999563.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-22T06-33-24.999563.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T16_32_16.099234", "path": ["results_2023-07-19T16:32:16.099234.parquet"]}, {"split": "2023_10_22T06_33_24.999563", "path": ["results_2023-10-22T06-33-24.999563.parquet"]}, {"split": "latest", "path": ["results_2023-10-22T06-33-24.999563.parquet"]}]}]}
|
2023-10-22T05:33:37+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of psmathur/orca_mini_7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/psmathur/orca_mini_7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [psmathur/orca_mini_7b](https://huggingface.co/psmathur/orca_mini_7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
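```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_psmathur__orca_mini_7b",
    "harness_winogrande_5",
    split="train")
```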
## Latest results
These are the [latest results from run 2023-10-22T06:33:24.999563](https://huggingface.co/datasets/open-llm-leaderboard/details_psmathur__orca_mini_7b/blob/main/results_2023-10-22T06-33-24.999563.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and in the "latest" split for each eval):
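```python
{
    "all": {
        "em": 0.06910654362416108,
        "em_stderr": 0.0025974621402952,
        "f1": 0.14139786073825483,
        "f1_stderr": 0.0029773237554709766,
        "acc": 0.3322031890175344,
        "acc_stderr": 0.007500207834545966
    },
    "harness|drop|3": {
        "em": 0.06910654362416108,
        "em_stderr": 0.0025974621402952,
        "f1": 0.14139786073825483,
        "f1_stderr": 0.0029773237554709766
    },
    "harness|gsm8k|5": {
        "acc": 0.0037907505686125853,
        "acc_stderr": 0.0016927007401501828
    },
    "harness|winogrande|5": {
        "acc": 0.6606156274664562,
        "acc_stderr": 0.01330771492894175
    }
}
```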
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of psmathur/orca_mini_7b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model psmathur/orca_mini_7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-22T06:33:24.999563(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of psmathur/orca_mini_7b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model psmathur/orca_mini_7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-22T06:33:24.999563(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
20,
31,
168,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of psmathur/orca_mini_7b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model psmathur/orca_mini_7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-22T06:33:24.999563(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
fbbfc5b6331a5ddb59a4e25b39ddf8a0cbc7cfad
|
# Dataset Card for Evaluation run of psmathur/orca_mini_v2_7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/psmathur/orca_mini_v2_7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [psmathur/orca_mini_v2_7b](https://huggingface.co/psmathur/orca_mini_v2_7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_psmathur__orca_mini_v2_7b",
"harness_winogrande_5",
split="train")
```
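The aggregated metrics live in the "results" configuration. As a minimal sketch (the configuration and split names are the ones listed in this repo's file layout), the "latest" split always resolves to the most recent run:

```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics of each run;
# the "latest" split always points at the most recent evaluation.
results = load_dataset("open-llm-leaderboard/details_psmathur__orca_mini_v2_7b",
    "results",
    split="latest")
```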
## Latest results
These are the [latest results from run 2023-09-22T15:49:31.845900](https://huggingface.co/datasets/open-llm-leaderboard/details_psmathur__orca_mini_v2_7b/blob/main/results_2023-09-22T15-49-31.845900.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.19305788590604026,
"em_stderr": 0.004042077305732669,
"f1": 0.2522955117449661,
"f1_stderr": 0.00407273200010099,
"acc": 0.371547709303585,
"acc_stderr": 0.008652008076903053
},
"harness|drop|3": {
"em": 0.19305788590604026,
"em_stderr": 0.004042077305732669,
"f1": 0.2522955117449661,
"f1_stderr": 0.00407273200010099
},
"harness|gsm8k|5": {
"acc": 0.02880970432145565,
"acc_stderr": 0.004607484283767487
},
"harness|winogrande|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.012696531870038616
}
}
```
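Each evaluated task has its own configuration (for example `harness_drop_3`, `harness_gsm8k_5`, or `harness_winogrande_5`), and each run adds a timestamped split next to "latest". As a sketch, assuming a recent `datasets` release that exposes the inspection helpers used below, you can enumerate them before loading anything:

```python
from datasets import get_dataset_config_names, get_dataset_split_names

repo = "open-llm-leaderboard/details_psmathur__orca_mini_v2_7b"

# Every per-task configuration stored in this repo (one per harness task).
configs = get_dataset_config_names(repo)

# The timestamped splits (plus "latest") recorded for a single task.
splits = get_dataset_split_names(repo, "harness_winogrande_5")

print(configs[:5], splits)
```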
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_psmathur__orca_mini_v2_7b
|
[
"region:us"
] |
2023-08-17T23:16:23+00:00
|
{"pretty_name": "Evaluation run of psmathur/orca_mini_v2_7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [psmathur/orca_mini_v2_7b](https://huggingface.co/psmathur/orca_mini_v2_7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_psmathur__orca_mini_v2_7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-22T15:49:31.845900](https://huggingface.co/datasets/open-llm-leaderboard/details_psmathur__orca_mini_v2_7b/blob/main/results_2023-09-22T15-49-31.845900.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.19305788590604026,\n \"em_stderr\": 0.004042077305732669,\n \"f1\": 0.2522955117449661,\n \"f1_stderr\": 0.00407273200010099,\n \"acc\": 0.371547709303585,\n \"acc_stderr\": 0.008652008076903053\n },\n \"harness|drop|3\": {\n \"em\": 0.19305788590604026,\n \"em_stderr\": 0.004042077305732669,\n \"f1\": 0.2522955117449661,\n \"f1_stderr\": 0.00407273200010099\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.02880970432145565,\n \"acc_stderr\": 0.004607484283767487\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.012696531870038616\n }\n}\n```", "repo_url": "https://huggingface.co/psmathur/orca_mini_v2_7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T16_55_35.342185", "path": ["**/details_harness|arc:challenge|25_2023-07-19T16:55:35.342185.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T16:55:35.342185.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_22T15_49_31.845900", "path": ["**/details_harness|drop|3_2023-09-22T15-49-31.845900.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-22T15-49-31.845900.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_22T15_49_31.845900", "path": ["**/details_harness|gsm8k|5_2023-09-22T15-49-31.845900.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-22T15-49-31.845900.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T16_55_35.342185", "path": ["**/details_harness|hellaswag|10_2023-07-19T16:55:35.342185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T16:55:35.342185.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_07_19T16_55_35.342185", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:55:35.342185.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:55:35.342185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:55:35.342185.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T16:55:35.342185.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:55:35.342185.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T16_55_35.342185", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:55:35.342185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T16:55:35.342185.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T16_55_35.342185", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:55:35.342185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T16:55:35.342185.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T16_55_35.342185", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:55:35.342185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T16:55:35.342185.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T16_55_35.342185", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:55:35.342185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T16:55:35.342185.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T16_55_35.342185", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:55:35.342185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T16:55:35.342185.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T16_55_35.342185", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:55:35.342185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T16:55:35.342185.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T16_55_35.342185", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:55:35.342185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T16:55:35.342185.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T16_55_35.342185", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:55:35.342185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T16:55:35.342185.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T16_55_35.342185", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:55:35.342185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T16:55:35.342185.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T16_55_35.342185", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:55:35.342185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T16:55:35.342185.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T16_55_35.342185", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:55:35.342185.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T16:55:35.342185.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T16_55_35.342185", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:55:35.342185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T16:55:35.342185.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T16_55_35.342185", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:55:35.342185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T16:55:35.342185.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T16_55_35.342185", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:55:35.342185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T16:55:35.342185.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T16_55_35.342185", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:55:35.342185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T16:55:35.342185.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T16_55_35.342185", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:55:35.342185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T16:55:35.342185.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T16_55_35.342185", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:55:35.342185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T16:55:35.342185.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T16_55_35.342185", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:55:35.342185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T16:55:35.342185.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T16_55_35.342185", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:55:35.342185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T16:55:35.342185.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T16_55_35.342185", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:55:35.342185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T16:55:35.342185.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T16_55_35.342185", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:55:35.342185.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T16:55:35.342185.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T16_55_35.342185", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:55:35.342185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T16:55:35.342185.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T16_55_35.342185", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:55:35.342185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T16:55:35.342185.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T16_55_35.342185", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:55:35.342185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T16:55:35.342185.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T16_55_35.342185", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:55:35.342185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T16:55:35.342185.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T16_55_35.342185", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:55:35.342185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T16:55:35.342185.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T16_55_35.342185", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:55:35.342185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T16:55:35.342185.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T16_55_35.342185", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:55:35.342185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T16:55:35.342185.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T16_55_35.342185", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:55:35.342185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T16:55:35.342185.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T16_55_35.342185", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:55:35.342185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T16:55:35.342185.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T16_55_35.342185", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:55:35.342185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T16:55:35.342185.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T16_55_35.342185", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:55:35.342185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T16:55:35.342185.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T16_55_35.342185", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:55:35.342185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T16:55:35.342185.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T16_55_35.342185", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:55:35.342185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T16:55:35.342185.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T16_55_35.342185", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:55:35.342185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T16:55:35.342185.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T16_55_35.342185", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:55:35.342185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T16:55:35.342185.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T16_55_35.342185", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:55:35.342185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T16:55:35.342185.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T16_55_35.342185", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:55:35.342185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T16:55:35.342185.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T16_55_35.342185", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T16:55:35.342185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T16:55:35.342185.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T16_55_35.342185", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:55:35.342185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T16:55:35.342185.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T16_55_35.342185", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:55:35.342185.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T16:55:35.342185.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T16_55_35.342185", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:55:35.342185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T16:55:35.342185.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T16_55_35.342185", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:55:35.342185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T16:55:35.342185.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T16_55_35.342185", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:55:35.342185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T16:55:35.342185.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T16_55_35.342185", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:55:35.342185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T16:55:35.342185.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T16_55_35.342185", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:55:35.342185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T16:55:35.342185.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T16_55_35.342185", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:55:35.342185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T16:55:35.342185.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T16_55_35.342185", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:55:35.342185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T16:55:35.342185.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T16_55_35.342185", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:55:35.342185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T16:55:35.342185.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T16_55_35.342185", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:55:35.342185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T16:55:35.342185.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T16_55_35.342185", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:55:35.342185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T16:55:35.342185.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T16_55_35.342185", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:55:35.342185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T16:55:35.342185.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T16_55_35.342185", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:55:35.342185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T16:55:35.342185.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T16_55_35.342185", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:55:35.342185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T16:55:35.342185.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T16_55_35.342185", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:55:35.342185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T16:55:35.342185.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T16_55_35.342185", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T16:55:35.342185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T16:55:35.342185.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T16_55_35.342185", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:55:35.342185.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T16:55:35.342185.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T16_55_35.342185", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T16:55:35.342185.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T16:55:35.342185.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_22T15_49_31.845900", "path": ["**/details_harness|winogrande|5_2023-09-22T15-49-31.845900.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-22T15-49-31.845900.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T16_55_35.342185", "path": ["results_2023-07-19T16:55:35.342185.parquet"]}, {"split": "2023_09_22T15_49_31.845900", "path": ["results_2023-09-22T15-49-31.845900.parquet"]}, {"split": "latest", "path": ["results_2023-09-22T15-49-31.845900.parquet"]}]}]}
|
2023-09-22T14:49:44+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of psmathur/orca_mini_v2_7b
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model psmathur/orca_mini_v2_7b on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
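The snippet itself was stripped from this copy of the card; a minimal sketch, assuming the repository follows the leaderboard's usual `details_<org>__<model>` naming convention:

```python
from datasets import load_dataset

# Assumed repository id, inferred from the leaderboard's naming convention.
data = load_dataset("open-llm-leaderboard/details_psmathur__orca_mini_v2_7b",
                    "harness_winogrande_5",
                    split="train")
```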
## Latest results
These are the latest results from run 2023-09-22T15:49:31.845900 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of psmathur/orca_mini_v2_7b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model psmathur/orca_mini_v2_7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-22T15:49:31.845900(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of psmathur/orca_mini_v2_7b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model psmathur/orca_mini_v2_7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-22T15:49:31.845900(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
23,
31,
171,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of psmathur/orca_mini_v2_7b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model psmathur/orca_mini_v2_7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-22T15:49:31.845900(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
730a8e4b487d21e3079c386ac1cfd057c5df5a44
|
# Dataset Card for Evaluation run of psmathur/orca_mini_v3_7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/psmathur/orca_mini_v3_7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [psmathur/orca_mini_v3_7b](https://huggingface.co/psmathur/orca_mini_v3_7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_psmathur__orca_mini_v3_7b",
"harness_winogrande_5",
split="train")
```
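The same pattern should work for the aggregated scores; a minimal sketch, assuming the `results` config and `latest` split listed in the metadata below behave like the per-task configs:

```python
from datasets import load_dataset

# "results" aggregates all task scores; "latest" points to the most recent run.
results = load_dataset("open-llm-leaderboard/details_psmathur__orca_mini_v3_7b",
                       "results",
                       split="latest")
```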
## Latest results
These are the [latest results from run 2023-10-18T04:27:15.231240](https://huggingface.co/datasets/open-llm-leaderboard/details_psmathur__orca_mini_v3_7b/blob/main/results_2023-10-18T04-27-15.231240.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.08043204697986577,
"em_stderr": 0.0027851341980506704,
"f1": 0.15059563758389252,
"f1_stderr": 0.0030534563383277672,
"acc": 0.4069827001752661,
"acc_stderr": 0.009686225873410097
},
"harness|drop|3": {
"em": 0.08043204697986577,
"em_stderr": 0.0027851341980506704,
"f1": 0.15059563758389252,
"f1_stderr": 0.0030534563383277672
},
"harness|gsm8k|5": {
"acc": 0.0712661106899166,
"acc_stderr": 0.007086462127954491
},
"harness|winogrande|5": {
"acc": 0.7426992896606156,
"acc_stderr": 0.012285989618865706
}
}
```
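To inspect the per-example predictions behind one of these scores, a minimal sketch, assuming the `harness_drop_3` config listed in the metadata below:

```python
from datasets import load_dataset

# Per-example details for the DROP eval; "latest" resolves to the 2023-10-18 run.
drop_details = load_dataset("open-llm-leaderboard/details_psmathur__orca_mini_v3_7b",
                            "harness_drop_3",
                            split="latest")
```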
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_psmathur__orca_mini_v3_7b
|
[
"region:us"
] |
2023-08-17T23:16:32+00:00
|
{"pretty_name": "Evaluation run of psmathur/orca_mini_v3_7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [psmathur/orca_mini_v3_7b](https://huggingface.co/psmathur/orca_mini_v3_7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_psmathur__orca_mini_v3_7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-18T04:27:15.231240](https://huggingface.co/datasets/open-llm-leaderboard/details_psmathur__orca_mini_v3_7b/blob/main/results_2023-10-18T04-27-15.231240.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.08043204697986577,\n \"em_stderr\": 0.0027851341980506704,\n \"f1\": 0.15059563758389252,\n \"f1_stderr\": 0.0030534563383277672,\n \"acc\": 0.4069827001752661,\n \"acc_stderr\": 0.009686225873410097\n },\n \"harness|drop|3\": {\n \"em\": 0.08043204697986577,\n \"em_stderr\": 0.0027851341980506704,\n \"f1\": 0.15059563758389252,\n \"f1_stderr\": 0.0030534563383277672\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0712661106899166,\n \"acc_stderr\": 0.007086462127954491\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7426992896606156,\n \"acc_stderr\": 0.012285989618865706\n }\n}\n```", "repo_url": "https://huggingface.co/psmathur/orca_mini_v3_7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_16T13_35_32.670682", "path": ["**/details_harness|arc:challenge|25_2023-08-16T13:35:32.670682.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-16T13:35:32.670682.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_18T04_27_15.231240", "path": ["**/details_harness|drop|3_2023-10-18T04-27-15.231240.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-18T04-27-15.231240.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_18T04_27_15.231240", "path": ["**/details_harness|gsm8k|5_2023-10-18T04-27-15.231240.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-18T04-27-15.231240.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_16T13_35_32.670682", "path": ["**/details_harness|hellaswag|10_2023-08-16T13:35:32.670682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-16T13:35:32.670682.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_08_16T13_35_32.670682", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-16T13:35:32.670682.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-16T13:35:32.670682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-16T13:35:32.670682.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-16T13:35:32.670682.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-16T13:35:32.670682.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_08_16T13_35_32.670682", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-16T13:35:32.670682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-16T13:35:32.670682.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_16T13_35_32.670682", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-16T13:35:32.670682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-16T13:35:32.670682.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_16T13_35_32.670682", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-16T13:35:32.670682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-16T13:35:32.670682.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_16T13_35_32.670682", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-16T13:35:32.670682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-16T13:35:32.670682.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_16T13_35_32.670682", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-16T13:35:32.670682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-16T13:35:32.670682.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_16T13_35_32.670682", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-16T13:35:32.670682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-16T13:35:32.670682.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_16T13_35_32.670682", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-16T13:35:32.670682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-16T13:35:32.670682.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_16T13_35_32.670682", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-16T13:35:32.670682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-16T13:35:32.670682.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_16T13_35_32.670682", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-16T13:35:32.670682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-16T13:35:32.670682.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_16T13_35_32.670682", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-16T13:35:32.670682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-16T13:35:32.670682.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_16T13_35_32.670682", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-16T13:35:32.670682.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-16T13:35:32.670682.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_16T13_35_32.670682", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-16T13:35:32.670682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-16T13:35:32.670682.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_16T13_35_32.670682", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-16T13:35:32.670682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-16T13:35:32.670682.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_16T13_35_32.670682", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-16T13:35:32.670682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-16T13:35:32.670682.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_16T13_35_32.670682", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-16T13:35:32.670682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-16T13:35:32.670682.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_16T13_35_32.670682", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-16T13:35:32.670682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-16T13:35:32.670682.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_16T13_35_32.670682", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-16T13:35:32.670682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-16T13:35:32.670682.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_16T13_35_32.670682", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-16T13:35:32.670682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-16T13:35:32.670682.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_16T13_35_32.670682", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-16T13:35:32.670682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-16T13:35:32.670682.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_16T13_35_32.670682", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-16T13:35:32.670682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-16T13:35:32.670682.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_16T13_35_32.670682", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-16T13:35:32.670682.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-16T13:35:32.670682.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_16T13_35_32.670682", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-16T13:35:32.670682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-16T13:35:32.670682.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_16T13_35_32.670682", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-16T13:35:32.670682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-16T13:35:32.670682.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_16T13_35_32.670682", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-16T13:35:32.670682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-16T13:35:32.670682.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_16T13_35_32.670682", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-16T13:35:32.670682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-16T13:35:32.670682.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_16T13_35_32.670682", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-16T13:35:32.670682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-16T13:35:32.670682.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_16T13_35_32.670682", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-16T13:35:32.670682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-16T13:35:32.670682.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_16T13_35_32.670682", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-16T13:35:32.670682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-16T13:35:32.670682.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_16T13_35_32.670682", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-16T13:35:32.670682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-16T13:35:32.670682.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_16T13_35_32.670682", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-16T13:35:32.670682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-16T13:35:32.670682.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_16T13_35_32.670682", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-16T13:35:32.670682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-16T13:35:32.670682.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_16T13_35_32.670682", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-16T13:35:32.670682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-16T13:35:32.670682.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_16T13_35_32.670682", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-16T13:35:32.670682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-16T13:35:32.670682.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_16T13_35_32.670682", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-16T13:35:32.670682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-16T13:35:32.670682.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_16T13_35_32.670682", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-16T13:35:32.670682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-16T13:35:32.670682.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_16T13_35_32.670682", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-16T13:35:32.670682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-16T13:35:32.670682.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_16T13_35_32.670682", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-16T13:35:32.670682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-16T13:35:32.670682.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_16T13_35_32.670682", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-16T13:35:32.670682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-16T13:35:32.670682.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_16T13_35_32.670682", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-16T13:35:32.670682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-16T13:35:32.670682.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_16T13_35_32.670682", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-16T13:35:32.670682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-16T13:35:32.670682.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_16T13_35_32.670682", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-16T13:35:32.670682.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-16T13:35:32.670682.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_16T13_35_32.670682", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-16T13:35:32.670682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-16T13:35:32.670682.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_16T13_35_32.670682", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-16T13:35:32.670682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-16T13:35:32.670682.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_16T13_35_32.670682", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-16T13:35:32.670682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-16T13:35:32.670682.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_16T13_35_32.670682", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-16T13:35:32.670682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-16T13:35:32.670682.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_16T13_35_32.670682", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-16T13:35:32.670682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-16T13:35:32.670682.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_16T13_35_32.670682", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-16T13:35:32.670682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-16T13:35:32.670682.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_16T13_35_32.670682", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-16T13:35:32.670682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-16T13:35:32.670682.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_16T13_35_32.670682", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-16T13:35:32.670682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-16T13:35:32.670682.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_16T13_35_32.670682", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-16T13:35:32.670682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-16T13:35:32.670682.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_16T13_35_32.670682", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-16T13:35:32.670682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-16T13:35:32.670682.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_08_16T13_35_32.670682", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-16T13:35:32.670682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-16T13:35:32.670682.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_16T13_35_32.670682", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-16T13:35:32.670682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-16T13:35:32.670682.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_16T13_35_32.670682", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-16T13:35:32.670682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-16T13:35:32.670682.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_16T13_35_32.670682", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-16T13:35:32.670682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-16T13:35:32.670682.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_16T13_35_32.670682", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-16T13:35:32.670682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-16T13:35:32.670682.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_16T13_35_32.670682", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-16T13:35:32.670682.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-16T13:35:32.670682.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_16T13_35_32.670682", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-16T13:35:32.670682.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-16T13:35:32.670682.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_18T04_27_15.231240", "path": ["**/details_harness|winogrande|5_2023-10-18T04-27-15.231240.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-18T04-27-15.231240.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_16T13_35_32.670682", "path": ["results_2023-08-16T13:35:32.670682.parquet"]}, {"split": "2023_10_18T04_27_15.231240", "path": ["results_2023-10-18T04-27-15.231240.parquet"]}, {"split": "latest", "path": ["results_2023-10-18T04-27-15.231240.parquet"]}]}]}
|
2023-10-18T03:27:27+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of psmathur/orca_mini_v3_7b
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model psmathur/orca_mini_v3_7b on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
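A minimal sketch, assuming the repository follows the leaderboard's `details_<org>__<model>` naming convention and exposes the same configs as its sibling cards:

```python
from datasets import load_dataset

# Hypothetical repository id, inferred from the leaderboard's naming convention
data = load_dataset("open-llm-leaderboard/details_psmathur__orca_mini_v3_7b",
	"harness_winogrande_5",
	split="train")
```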
## Latest results
These are the latest results from run 2023-10-18T04:27:15.231240 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of psmathur/orca_mini_v3_7b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model psmathur/orca_mini_v3_7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-18T04:27:15.231240(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of psmathur/orca_mini_v3_7b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model psmathur/orca_mini_v3_7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-18T04:27:15.231240(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
23,
31,
171,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of psmathur/orca_mini_v3_7b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model psmathur/orca_mini_v3_7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-18T04:27:15.231240(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
2155d3cc8311989c62c7d13d8858b5fa005f170e
|
# Dataset Card for Evaluation run of psmathur/model_007
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/psmathur/model_007
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [psmathur/model_007](https://huggingface.co/psmathur/model_007) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_psmathur__model_007_public",
"harness_winogrande_5",
split="train")
```
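Since each evaluated task lives in its own configuration, it can help to enumerate the configurations first. A minimal sketch using the `datasets` config-discovery helper (the repository id is the one from the example above; the config chosen below is just an illustration):

```python
from datasets import get_dataset_config_names, load_dataset

# One configuration per evaluated task, plus the aggregated "results" config
configs = get_dataset_config_names("open-llm-leaderboard/details_psmathur__model_007_public")
print(configs)

# The "latest" split of any configuration points at the most recent run
gsm8k_latest = load_dataset("open-llm-leaderboard/details_psmathur__model_007_public",
	"harness_gsm8k_5",
	split="latest")
```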
## Latest results
These are the [latest results from run 2023-11-09T13:26:16.051201](https://huggingface.co/datasets/open-llm-leaderboard/details_psmathur__model_007_public/blob/main/results_2023-11-09T13-26-16.051201.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.13276006711409397,
"em_stderr": 0.0034749056446198375,
"f1": 0.31045721476510313,
"f1_stderr": 0.003655086215890851,
"acc": 0.602479216693903,
"acc_stderr": 0.011890317786243781
},
"harness|drop|3": {
"em": 0.13276006711409397,
"em_stderr": 0.0034749056446198375,
"f1": 0.31045721476510313,
"f1_stderr": 0.003655086215890851
},
"harness|gsm8k|5": {
"acc": 0.37149355572403336,
"acc_stderr": 0.01330983907570648
},
"harness|winogrande|5": {
"acc": 0.8334648776637726,
"acc_stderr": 0.010470796496781083
}
}
```
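The same aggregated numbers can also be pulled programmatically from the "results" configuration rather than the JSON file; a minimal sketch, assuming only the config and split names described above:

```python
from datasets import load_dataset

# "results" stores the aggregated metrics; "latest" resolves to the newest run
results = load_dataset("open-llm-leaderboard/details_psmathur__model_007_public",
	"results",
	split="latest")
print(results[0])  # inspect the stored record backing the JSON shown above
```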
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_psmathur__model_007
|
[
"region:us"
] |
2023-08-17T23:16:51+00:00
|
{"pretty_name": "Evaluation run of psmathur/model_007", "dataset_summary": "Dataset automatically created during the evaluation run of model [psmathur/model_007](https://huggingface.co/psmathur/model_007) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 3 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_psmathur__model_007_public\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-11-09T13:26:16.051201](https://huggingface.co/datasets/open-llm-leaderboard/details_psmathur__model_007_public/blob/main/results_2023-11-09T13-26-16.051201.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.13276006711409397,\n \"em_stderr\": 0.0034749056446198375,\n \"f1\": 0.31045721476510313,\n \"f1_stderr\": 0.003655086215890851,\n \"acc\": 0.602479216693903,\n \"acc_stderr\": 0.011890317786243781\n },\n \"harness|drop|3\": {\n \"em\": 0.13276006711409397,\n \"em_stderr\": 0.0034749056446198375,\n \"f1\": 0.31045721476510313,\n \"f1_stderr\": 0.003655086215890851\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.37149355572403336,\n \"acc_stderr\": 0.01330983907570648\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8334648776637726,\n \"acc_stderr\": 0.010470796496781083\n }\n}\n```", "repo_url": "https://huggingface.co/psmathur/model_007", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_drop_3", "data_files": [{"split": "2023_11_09T13_26_16.051201", "path": ["**/details_harness|drop|3_2023-11-09T13-26-16.051201.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-11-09T13-26-16.051201.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_11_09T13_26_16.051201", "path": ["**/details_harness|gsm8k|5_2023-11-09T13-26-16.051201.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-11-09T13-26-16.051201.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_11_09T13_26_16.051201", "path": ["**/details_harness|winogrande|5_2023-11-09T13-26-16.051201.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-11-09T13-26-16.051201.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_11_09T13_26_16.051201", "path": ["results_2023-11-09T13-26-16.051201.parquet"]}, {"split": "latest", "path": ["results_2023-11-09T13-26-16.051201.parquet"]}]}]}
|
2023-12-01T14:55:44+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of psmathur/model_007
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model psmathur/model_007 on the Open LLM Leaderboard.
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-11-09T13:26:16.051201 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of psmathur/model_007",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model psmathur/model_007 on the Open LLM Leaderboard.\n\nThe dataset is composed of 3 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-11-09T13:26:16.051201(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of psmathur/model_007",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model psmathur/model_007 on the Open LLM Leaderboard.\n\nThe dataset is composed of 3 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-11-09T13:26:16.051201(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
16,
31,
165,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of psmathur/model_007## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model psmathur/model_007 on the Open LLM Leaderboard.\n\nThe dataset is composed of 3 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-11-09T13:26:16.051201(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
c17a0ad12b933dbc691a7386dbac626f47c788fd
|
# Dataset Card for Evaluation run of psmathur/orca_mini_v3_13b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/psmathur/orca_mini_v3_13b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [psmathur/orca_mini_v3_13b](https://huggingface.co/psmathur/orca_mini_v3_13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_psmathur__orca_mini_v3_13b",
"harness_winogrande_5",
split="train")
```
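Because every run is kept as a timestamp-named split, an earlier run can also be loaded explicitly instead of via "latest"; a sketch using a split name taken from this repository's configuration metadata:

```python
from datasets import load_dataset

# Load one specific run by its timestamp-named split
run = load_dataset("open-llm-leaderboard/details_psmathur__orca_mini_v3_13b",
	"harness_winogrande_5",
	split="2023_10_18T15_47_49.456107")
```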
## Latest results
These are the [latest results from run 2023-10-18T15:47:49.456107](https://huggingface.co/datasets/open-llm-leaderboard/details_psmathur__orca_mini_v3_13b/blob/main/results_2023-10-18T15-47-49.456107.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.15383808724832215,
"em_stderr": 0.0036948628598682874,
"f1": 0.22225880872483197,
"f1_stderr": 0.0037670501187578413,
"acc": 0.44797935342421163,
"acc_stderr": 0.010609253699619367
},
"harness|drop|3": {
"em": 0.15383808724832215,
"em_stderr": 0.0036948628598682874,
"f1": 0.22225880872483197,
"f1_stderr": 0.0037670501187578413
},
"harness|gsm8k|5": {
"acc": 0.13115996967399546,
"acc_stderr": 0.00929849923558785
},
"harness|winogrande|5": {
"acc": 0.7647987371744278,
"acc_stderr": 0.011920008163650884
}
}
```
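Once the results JSON above has been fetched (for instance from the linked file), pulling out a single metric is a one-liner; a small sketch, assuming the raw JSON document is already held in a string `raw`:

```python
import json

# `raw` is assumed to hold the results JSON shown above
results = json.loads(raw)
winogrande_acc = results["harness|winogrande|5"]["acc"]
print(f"winogrande (5-shot) accuracy: {winogrande_acc:.4f}")  # ~0.7648
```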
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_psmathur__orca_mini_v3_13b
|
[
"region:us"
] |
2023-08-17T23:17:00+00:00
|
{"pretty_name": "Evaluation run of psmathur/orca_mini_v3_13b", "dataset_summary": "Dataset automatically created during the evaluation run of model [psmathur/orca_mini_v3_13b](https://huggingface.co/psmathur/orca_mini_v3_13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_psmathur__orca_mini_v3_13b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-18T15:47:49.456107](https://huggingface.co/datasets/open-llm-leaderboard/details_psmathur__orca_mini_v3_13b/blob/main/results_2023-10-18T15-47-49.456107.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.15383808724832215,\n \"em_stderr\": 0.0036948628598682874,\n \"f1\": 0.22225880872483197,\n \"f1_stderr\": 0.0037670501187578413,\n \"acc\": 0.44797935342421163,\n \"acc_stderr\": 0.010609253699619367\n },\n \"harness|drop|3\": {\n \"em\": 0.15383808724832215,\n \"em_stderr\": 0.0036948628598682874,\n \"f1\": 0.22225880872483197,\n \"f1_stderr\": 0.0037670501187578413\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.13115996967399546,\n \"acc_stderr\": 0.00929849923558785\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7647987371744278,\n \"acc_stderr\": 0.011920008163650884\n }\n}\n```", "repo_url": "https://huggingface.co/psmathur/orca_mini_v3_13b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_09T21_34_12.529590", "path": ["**/details_harness|arc:challenge|25_2023-08-09T21:34:12.529590.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-09T21:34:12.529590.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_18T15_47_49.456107", "path": ["**/details_harness|drop|3_2023-10-18T15-47-49.456107.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-18T15-47-49.456107.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_18T15_47_49.456107", "path": ["**/details_harness|gsm8k|5_2023-10-18T15-47-49.456107.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-18T15-47-49.456107.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_09T21_34_12.529590", "path": ["**/details_harness|hellaswag|10_2023-08-09T21:34:12.529590.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-09T21:34:12.529590.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": 
[{"split": "2023_08_09T21_34_12.529590", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T21:34:12.529590.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T21:34:12.529590.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T21:34:12.529590.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T21:34:12.529590.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T21:34:12.529590.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_08_09T21_34_12.529590", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T21:34:12.529590.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T21:34:12.529590.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_09T21_34_12.529590", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T21:34:12.529590.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T21:34:12.529590.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_09T21_34_12.529590", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T21:34:12.529590.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T21:34:12.529590.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_09T21_34_12.529590", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T21:34:12.529590.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T21:34:12.529590.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_09T21_34_12.529590", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T21:34:12.529590.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T21:34:12.529590.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_09T21_34_12.529590", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T21:34:12.529590.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T21:34:12.529590.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_09T21_34_12.529590", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T21:34:12.529590.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T21:34:12.529590.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_09T21_34_12.529590", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T21:34:12.529590.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T21:34:12.529590.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_09T21_34_12.529590", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T21:34:12.529590.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T21:34:12.529590.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_09T21_34_12.529590", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T21:34:12.529590.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T21:34:12.529590.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_09T21_34_12.529590", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T21:34:12.529590.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T21:34:12.529590.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_09T21_34_12.529590", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T21:34:12.529590.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T21:34:12.529590.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_09T21_34_12.529590", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T21:34:12.529590.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T21:34:12.529590.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_09T21_34_12.529590", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T21:34:12.529590.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T21:34:12.529590.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_09T21_34_12.529590", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T21:34:12.529590.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T21:34:12.529590.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_09T21_34_12.529590", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T21:34:12.529590.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T21:34:12.529590.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_09T21_34_12.529590", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T21:34:12.529590.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T21:34:12.529590.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_09T21_34_12.529590", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T21:34:12.529590.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T21:34:12.529590.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_09T21_34_12.529590", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T21:34:12.529590.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T21:34:12.529590.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_09T21_34_12.529590", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T21:34:12.529590.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T21:34:12.529590.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_09T21_34_12.529590", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T21:34:12.529590.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T21:34:12.529590.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_09T21_34_12.529590", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T21:34:12.529590.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T21:34:12.529590.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_09T21_34_12.529590", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T21:34:12.529590.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T21:34:12.529590.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_09T21_34_12.529590", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T21:34:12.529590.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T21:34:12.529590.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_09T21_34_12.529590", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T21:34:12.529590.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T21:34:12.529590.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_09T21_34_12.529590", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T21:34:12.529590.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T21:34:12.529590.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_09T21_34_12.529590", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T21:34:12.529590.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T21:34:12.529590.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_09T21_34_12.529590", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T21:34:12.529590.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T21:34:12.529590.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_09T21_34_12.529590", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T21:34:12.529590.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T21:34:12.529590.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_09T21_34_12.529590", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T21:34:12.529590.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T21:34:12.529590.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_09T21_34_12.529590", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T21:34:12.529590.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T21:34:12.529590.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_09T21_34_12.529590", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T21:34:12.529590.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T21:34:12.529590.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_09T21_34_12.529590", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T21:34:12.529590.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T21:34:12.529590.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_09T21_34_12.529590", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T21:34:12.529590.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T21:34:12.529590.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_09T21_34_12.529590", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T21:34:12.529590.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T21:34:12.529590.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_09T21_34_12.529590", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T21:34:12.529590.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T21:34:12.529590.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_09T21_34_12.529590", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T21:34:12.529590.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T21:34:12.529590.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_09T21_34_12.529590", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T21:34:12.529590.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T21:34:12.529590.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_09T21_34_12.529590", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T21:34:12.529590.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T21:34:12.529590.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_09T21_34_12.529590", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T21:34:12.529590.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T21:34:12.529590.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_09T21_34_12.529590", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T21:34:12.529590.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T21:34:12.529590.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_09T21_34_12.529590", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T21:34:12.529590.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T21:34:12.529590.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_09T21_34_12.529590", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T21:34:12.529590.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T21:34:12.529590.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_09T21_34_12.529590", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T21:34:12.529590.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T21:34:12.529590.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_09T21_34_12.529590", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T21:34:12.529590.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T21:34:12.529590.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_09T21_34_12.529590", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T21:34:12.529590.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T21:34:12.529590.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_09T21_34_12.529590", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T21:34:12.529590.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T21:34:12.529590.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_09T21_34_12.529590", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T21:34:12.529590.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T21:34:12.529590.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_09T21_34_12.529590", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T21:34:12.529590.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T21:34:12.529590.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_09T21_34_12.529590", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T21:34:12.529590.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T21:34:12.529590.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_09T21_34_12.529590", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T21:34:12.529590.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T21:34:12.529590.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_08_09T21_34_12.529590", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T21:34:12.529590.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T21:34:12.529590.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_09T21_34_12.529590", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T21:34:12.529590.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T21:34:12.529590.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_09T21_34_12.529590", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T21:34:12.529590.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T21:34:12.529590.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_09T21_34_12.529590", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T21:34:12.529590.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T21:34:12.529590.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_09T21_34_12.529590", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T21:34:12.529590.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T21:34:12.529590.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_09T21_34_12.529590", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T21:34:12.529590.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T21:34:12.529590.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_09T21_34_12.529590", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T21:34:12.529590.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T21:34:12.529590.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_18T15_47_49.456107", "path": ["**/details_harness|winogrande|5_2023-10-18T15-47-49.456107.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-18T15-47-49.456107.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_09T21_34_12.529590", "path": ["results_2023-08-09T21:34:12.529590.parquet"]}, {"split": "2023_10_18T15_47_49.456107", "path": ["results_2023-10-18T15-47-49.456107.parquet"]}, {"split": "latest", "path": ["results_2023-10-18T15-47-49.456107.parquet"]}]}]}
|
2023-10-18T14:48:03+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of psmathur/orca_mini_v3_13b
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model psmathur/orca_mini_v3_13b on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
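A minimal sketch, assuming the repository follows the `open-llm-leaderboard/details_<org>__<model>` naming convention used by the other cards in this document (the exact repository id is an assumption here, since this card's URLs were stripped):

```python
from datasets import load_dataset

# Hypothetical repository id, inferred from the naming convention above
data = load_dataset("open-llm-leaderboard/details_psmathur__orca_mini_v3_13b",
	"harness_winogrande_5",
	split="train")
```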
## Latest results
These are the latest results from run 2023-10-18T15:47:49.456107 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of psmathur/orca_mini_v3_13b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model psmathur/orca_mini_v3_13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-18T15:47:49.456107(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of psmathur/orca_mini_v3_13b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model psmathur/orca_mini_v3_13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-18T15:47:49.456107(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
23,
31,
171,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of psmathur/orca_mini_v3_13b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model psmathur/orca_mini_v3_13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-18T15:47:49.456107(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
c1a0dd237d99e88b1b824d7df8e9c93f87cb9e7c
|
# Dataset Card for Evaluation run of psmathur/test_42_70b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/psmathur/test_42_70b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [psmathur/test_42_70b](https://huggingface.co/psmathur/test_42_70b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_psmathur__test_42_70b_public",
"harness_winogrande_5",
split="train")
```
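Each run is also available under its own timestamped split; for example, to load the earlier of the two runs listed in this card's configuration (split name taken from the metadata below):

```python
from datasets import load_dataset

# Load a specific timestamped run instead of the latest results
data = load_dataset("open-llm-leaderboard/details_psmathur__test_42_70b_public",
	"harness_winogrande_5",
	split="2023_11_05T10_37_53.854467")
```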
## Latest results
These are the [latest results from run 2023-11-07T08:14:38.218715](https://huggingface.co/datasets/open-llm-leaderboard/details_psmathur__test_42_70b_public/blob/main/results_2023-11-07T08-14-38.218715.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.08095637583892618,
"em_stderr": 0.0027934007378494835,
"f1": 0.14089450503355697,
"f1_stderr": 0.002922494704077647,
"acc": 0.6480304552550813,
"acc_stderr": 0.012058894490351774
},
"harness|drop|3": {
"em": 0.08095637583892618,
"em_stderr": 0.0027934007378494835,
"f1": 0.14089450503355697,
"f1_stderr": 0.002922494704077647
},
"harness|gsm8k|5": {
"acc": 0.45943896891584535,
"acc_stderr": 0.013727093010429786
},
"harness|winogrande|5": {
"acc": 0.8366219415943172,
"acc_stderr": 0.01039069597027376
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_psmathur__test_42_70b
|
[
"region:us"
] |
2023-08-17T23:17:09+00:00
|
{"pretty_name": "Evaluation run of psmathur/test_42_70b", "dataset_summary": "Dataset automatically created during the evaluation run of model [psmathur/test_42_70b](https://huggingface.co/psmathur/test_42_70b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 3 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_psmathur__test_42_70b_public\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-11-07T08:14:38.218715](https://huggingface.co/datasets/open-llm-leaderboard/details_psmathur__test_42_70b_public/blob/main/results_2023-11-07T08-14-38.218715.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.08095637583892618,\n \"em_stderr\": 0.0027934007378494835,\n \"f1\": 0.14089450503355697,\n \"f1_stderr\": 0.002922494704077647,\n \"acc\": 0.6480304552550813,\n \"acc_stderr\": 0.012058894490351774\n },\n \"harness|drop|3\": {\n \"em\": 0.08095637583892618,\n \"em_stderr\": 0.0027934007378494835,\n \"f1\": 0.14089450503355697,\n \"f1_stderr\": 0.002922494704077647\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.45943896891584535,\n \"acc_stderr\": 0.013727093010429786\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8366219415943172,\n \"acc_stderr\": 0.01039069597027376\n }\n}\n```", "repo_url": "https://huggingface.co/psmathur/test_42_70b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_drop_3", "data_files": [{"split": "2023_11_05T10_37_53.854467", "path": ["**/details_harness|drop|3_2023-11-05T10-37-53.854467.parquet"]}, {"split": "2023_11_07T08_14_38.218715", "path": ["**/details_harness|drop|3_2023-11-07T08-14-38.218715.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-11-07T08-14-38.218715.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_11_05T10_37_53.854467", "path": ["**/details_harness|gsm8k|5_2023-11-05T10-37-53.854467.parquet"]}, {"split": "2023_11_07T08_14_38.218715", "path": ["**/details_harness|gsm8k|5_2023-11-07T08-14-38.218715.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-11-07T08-14-38.218715.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_11_05T10_37_53.854467", "path": ["**/details_harness|winogrande|5_2023-11-05T10-37-53.854467.parquet"]}, {"split": "2023_11_07T08_14_38.218715", "path": ["**/details_harness|winogrande|5_2023-11-07T08-14-38.218715.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-11-07T08-14-38.218715.parquet"]}]}, {"config_name": 
"results", "data_files": [{"split": "2023_11_05T10_37_53.854467", "path": ["results_2023-11-05T10-37-53.854467.parquet"]}, {"split": "2023_11_07T08_14_38.218715", "path": ["results_2023-11-07T08-14-38.218715.parquet"]}, {"split": "latest", "path": ["results_2023-11-07T08-14-38.218715.parquet"]}]}]}
|
2023-12-01T14:32:59+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of psmathur/test_42_70b
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model psmathur/test_42_70b on the Open LLM Leaderboard.
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
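A minimal sketch, mirroring the loader shown in the full card above (the repository id is taken from that card):

```python
from datasets import load_dataset

data = load_dataset("open-llm-leaderboard/details_psmathur__test_42_70b_public",
	"harness_winogrande_5",
	split="train")
```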
## Latest results
These are the latest results from run 2023-11-07T08:14:38.218715 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of psmathur/test_42_70b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model psmathur/test_42_70b on the Open LLM Leaderboard.\n\nThe dataset is composed of 3 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-11-07T08:14:38.218715(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of psmathur/test_42_70b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model psmathur/test_42_70b on the Open LLM Leaderboard.\n\nThe dataset is composed of 3 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-11-07T08:14:38.218715(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
19,
31,
168,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of psmathur/test_42_70b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model psmathur/test_42_70b on the Open LLM Leaderboard.\n\nThe dataset is composed of 3 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-11-07T08:14:38.218715(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
34bd2a731d308c3c797818d8647d6a8eab009c3a
|
# Dataset Card for Evaluation run of psmathur/model_51
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/psmathur/model_51
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [psmathur/model_51](https://huggingface.co/psmathur/model_51) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_psmathur__model_51",
"harness_winogrande_5",
split="train")
```
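The aggregated results of each run can be loaded the same way; for example, the first run listed in this card's "results" configuration (split name taken from the metadata below):

```python
from datasets import load_dataset

# Load the aggregated results of a specific run
results = load_dataset("open-llm-leaderboard/details_psmathur__model_51",
	"results",
	split="2023_08_09T16_28_12.692272")
```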
## Latest results
These are the [latest results from run 2023-10-18T14:06:44.247035](https://huggingface.co/datasets/open-llm-leaderboard/details_psmathur__model_51/blob/main/results_2023-10-18T14-06-44.247035.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.4897231543624161,
"em_stderr": 0.005119386295274547,
"f1": 0.5842900587248353,
"f1_stderr": 0.004629096995502163,
"acc": 0.5707048282852822,
"acc_stderr": 0.011869906495819173
},
"harness|drop|3": {
"em": 0.4897231543624161,
"em_stderr": 0.005119386295274547,
"f1": 0.5842900587248353,
"f1_stderr": 0.004629096995502163
},
"harness|gsm8k|5": {
"acc": 0.3237300985595148,
"acc_stderr": 0.01288824739737114
},
"harness|winogrande|5": {
"acc": 0.8176795580110497,
"acc_stderr": 0.010851565594267205
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_psmathur__model_51
|
[
"region:us"
] |
2023-08-17T23:17:19+00:00
|
{"pretty_name": "Evaluation run of psmathur/model_51", "dataset_summary": "Dataset automatically created during the evaluation run of model [psmathur/model_51](https://huggingface.co/psmathur/model_51) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_psmathur__model_51\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-18T14:06:44.247035](https://huggingface.co/datasets/open-llm-leaderboard/details_psmathur__model_51/blob/main/results_2023-10-18T14-06-44.247035.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.4897231543624161,\n \"em_stderr\": 0.005119386295274547,\n \"f1\": 0.5842900587248353,\n \"f1_stderr\": 0.004629096995502163,\n \"acc\": 0.5707048282852822,\n \"acc_stderr\": 0.011869906495819173\n },\n \"harness|drop|3\": {\n \"em\": 0.4897231543624161,\n \"em_stderr\": 0.005119386295274547,\n \"f1\": 0.5842900587248353,\n \"f1_stderr\": 0.004629096995502163\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3237300985595148,\n \"acc_stderr\": 0.01288824739737114\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8176795580110497,\n \"acc_stderr\": 0.010851565594267205\n }\n}\n```", "repo_url": "https://huggingface.co/psmathur/model_51", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_09T16_28_12.692272", "path": ["**/details_harness|arc:challenge|25_2023-08-09T16:28:12.692272.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-09T16:28:12.692272.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_18T14_06_44.247035", "path": ["**/details_harness|drop|3_2023-10-18T14-06-44.247035.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-18T14-06-44.247035.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_18T14_06_44.247035", "path": ["**/details_harness|gsm8k|5_2023-10-18T14-06-44.247035.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-18T14-06-44.247035.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_09T16_28_12.692272", "path": ["**/details_harness|hellaswag|10_2023-08-09T16:28:12.692272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-09T16:28:12.692272.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_09T16_28_12.692272", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T16:28:12.692272.parquet", 
"**/details_harness|hendrycksTest-management|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T16:28:12.692272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T16:28:12.692272.parquet", 
"**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T16:28:12.692272.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T16:28:12.692272.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_09T16_28_12.692272", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T16:28:12.692272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T16:28:12.692272.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_09T16_28_12.692272", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T16:28:12.692272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T16:28:12.692272.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_09T16_28_12.692272", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T16:28:12.692272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T16:28:12.692272.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_09T16_28_12.692272", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T16:28:12.692272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T16:28:12.692272.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_09T16_28_12.692272", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T16:28:12.692272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T16:28:12.692272.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_09T16_28_12.692272", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T16:28:12.692272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T16:28:12.692272.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_09T16_28_12.692272", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T16:28:12.692272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T16:28:12.692272.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_09T16_28_12.692272", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T16:28:12.692272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T16:28:12.692272.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_09T16_28_12.692272", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T16:28:12.692272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T16:28:12.692272.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_09T16_28_12.692272", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T16:28:12.692272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T16:28:12.692272.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_09T16_28_12.692272", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T16:28:12.692272.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T16:28:12.692272.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_09T16_28_12.692272", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T16:28:12.692272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T16:28:12.692272.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_09T16_28_12.692272", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T16:28:12.692272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T16:28:12.692272.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_09T16_28_12.692272", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T16:28:12.692272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T16:28:12.692272.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_09T16_28_12.692272", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T16:28:12.692272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T16:28:12.692272.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_09T16_28_12.692272", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T16:28:12.692272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T16:28:12.692272.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_09T16_28_12.692272", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T16:28:12.692272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T16:28:12.692272.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_09T16_28_12.692272", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T16:28:12.692272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T16:28:12.692272.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_09T16_28_12.692272", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T16:28:12.692272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T16:28:12.692272.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_09T16_28_12.692272", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T16:28:12.692272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T16:28:12.692272.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_09T16_28_12.692272", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T16:28:12.692272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T16:28:12.692272.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_09T16_28_12.692272", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T16:28:12.692272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T16:28:12.692272.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_09T16_28_12.692272", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T16:28:12.692272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T16:28:12.692272.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_09T16_28_12.692272", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T16:28:12.692272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T16:28:12.692272.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_09T16_28_12.692272", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T16:28:12.692272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T16:28:12.692272.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_09T16_28_12.692272", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T16:28:12.692272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T16:28:12.692272.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_09T16_28_12.692272", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T16:28:12.692272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T16:28:12.692272.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_09T16_28_12.692272", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T16:28:12.692272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T16:28:12.692272.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_09T16_28_12.692272", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T16:28:12.692272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T16:28:12.692272.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_09T16_28_12.692272", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T16:28:12.692272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T16:28:12.692272.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_09T16_28_12.692272", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T16:28:12.692272.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T16:28:12.692272.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_09T16_28_12.692272", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T16:28:12.692272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T16:28:12.692272.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_09T16_28_12.692272", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T16:28:12.692272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T16:28:12.692272.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_09T16_28_12.692272", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T16:28:12.692272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T16:28:12.692272.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_09T16_28_12.692272", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T16:28:12.692272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T16:28:12.692272.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_09T16_28_12.692272", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T16:28:12.692272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T16:28:12.692272.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_09T16_28_12.692272", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T16:28:12.692272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T16:28:12.692272.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_09T16_28_12.692272", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T16:28:12.692272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T16:28:12.692272.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_09T16_28_12.692272", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T16:28:12.692272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T16:28:12.692272.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_09T16_28_12.692272", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T16:28:12.692272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T16:28:12.692272.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_09T16_28_12.692272", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T16:28:12.692272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T16:28:12.692272.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": 
"2023_08_09T16_28_12.692272", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T16:28:12.692272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T16:28:12.692272.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_09T16_28_12.692272", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T16:28:12.692272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T16:28:12.692272.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_09T16_28_12.692272", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T16:28:12.692272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T16:28:12.692272.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_09T16_28_12.692272", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T16:28:12.692272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T16:28:12.692272.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_09T16_28_12.692272", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T16:28:12.692272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T16:28:12.692272.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_09T16_28_12.692272", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T16:28:12.692272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T16:28:12.692272.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_09T16_28_12.692272", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T16:28:12.692272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T16:28:12.692272.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_09T16_28_12.692272", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T16:28:12.692272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T16:28:12.692272.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_09T16_28_12.692272", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T16:28:12.692272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T16:28:12.692272.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_09T16_28_12.692272", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T16:28:12.692272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T16:28:12.692272.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_09T16_28_12.692272", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T16:28:12.692272.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T16:28:12.692272.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_09T16_28_12.692272", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T16:28:12.692272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T16:28:12.692272.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_09T16_28_12.692272", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T16:28:12.692272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T16:28:12.692272.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_09T16_28_12.692272", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T16:28:12.692272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T16:28:12.692272.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_09T16_28_12.692272", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T16:28:12.692272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T16:28:12.692272.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_09T16_28_12.692272", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T16:28:12.692272.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T16:28:12.692272.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_09T16_28_12.692272", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T16:28:12.692272.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T16:28:12.692272.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_18T14_06_44.247035", "path": ["**/details_harness|winogrande|5_2023-10-18T14-06-44.247035.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-18T14-06-44.247035.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_09T16_28_12.692272", "path": ["results_2023-08-09T16:28:12.692272.parquet"]}, {"split": "2023_10_18T14_06_44.247035", "path": ["results_2023-10-18T14-06-44.247035.parquet"]}, {"split": "latest", "path": ["results_2023-10-18T14-06-44.247035.parquet"]}]}]}
|
2023-10-18T13:06:56+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of psmathur/model_51
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model psmathur/model_51 on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
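A minimal sketch (the repository id below is assumed from the standard `open-llm-leaderboard/details_<org>__<model>` naming convention these evaluation datasets follow):

```python
from datasets import load_dataset

# Assumed repository id, following the open-llm-leaderboard naming convention
data = load_dataset("open-llm-leaderboard/details_psmathur__model_51",
                    "harness_winogrande_5",  # one of the 64 task configurations
                    split="train")           # "train" always points to the latest results
```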
## Latest results
These are the latest results from run 2023-10-18T14:06:44.247035 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of psmathur/model_51",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model psmathur/model_51 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-18T14:06:44.247035(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of psmathur/model_51",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model psmathur/model_51 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-18T14:06:44.247035(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
16,
31,
164,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of psmathur/model_51## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model psmathur/model_51 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-18T14:06:44.247035(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
8d83d30b837299630860ab74873ffc2cc6cdde51
|
# Dataset Card for Evaluation run of psmathur/orca_mini_v2_13b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/psmathur/orca_mini_v2_13b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [psmathur/orca_mini_v2_13b](https://huggingface.co/psmathur/orca_mini_v2_13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_psmathur__orca_mini_v2_13b",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-23T06:53:23.116359](https://huggingface.co/datasets/open-llm-leaderboard/details_psmathur__orca_mini_v2_13b/blob/main/results_2023-09-23T06-53-23.116359.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.06040268456375839,
"em_stderr": 0.002439712523172895,
"f1": 0.14132655201342267,
"f1_stderr": 0.0028412840520175065,
"acc": 0.39529928978029205,
"acc_stderr": 0.009624116592337587
},
"harness|drop|3": {
"em": 0.06040268456375839,
"em_stderr": 0.002439712523172895,
"f1": 0.14132655201342267,
"f1_stderr": 0.0028412840520175065
},
"harness|gsm8k|5": {
"acc": 0.06368460955269144,
"acc_stderr": 0.00672621307880572
},
"harness|winogrande|5": {
"acc": 0.7269139700078927,
"acc_stderr": 0.012522020105869456
}
}
```
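To read these aggregated metrics programmatically, a minimal sketch is to load the "results" configuration, whose "latest" split points to the newest run:

```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics of every run;
# the "latest" split points to the newest one.
results = load_dataset("open-llm-leaderboard/details_psmathur__orca_mini_v2_13b",
                       "results",
                       split="latest")
print(results[0])  # the serialized metrics, matching the JSON shown above
```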
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_psmathur__orca_mini_v2_13b
|
[
"region:us"
] |
2023-08-17T23:17:28+00:00
|
{"pretty_name": "Evaluation run of psmathur/orca_mini_v2_13b", "dataset_summary": "Dataset automatically created during the evaluation run of model [psmathur/orca_mini_v2_13b](https://huggingface.co/psmathur/orca_mini_v2_13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_psmathur__orca_mini_v2_13b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-23T06:53:23.116359](https://huggingface.co/datasets/open-llm-leaderboard/details_psmathur__orca_mini_v2_13b/blob/main/results_2023-09-23T06-53-23.116359.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.06040268456375839,\n \"em_stderr\": 0.002439712523172895,\n \"f1\": 0.14132655201342267,\n \"f1_stderr\": 0.0028412840520175065,\n \"acc\": 0.39529928978029205,\n \"acc_stderr\": 0.009624116592337587\n },\n \"harness|drop|3\": {\n \"em\": 0.06040268456375839,\n \"em_stderr\": 0.002439712523172895,\n \"f1\": 0.14132655201342267,\n \"f1_stderr\": 0.0028412840520175065\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.06368460955269144,\n \"acc_stderr\": 0.00672621307880572\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7269139700078927,\n \"acc_stderr\": 0.012522020105869456\n }\n}\n```", "repo_url": "https://huggingface.co/psmathur/orca_mini_v2_13b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T19_28_41.797658", "path": ["**/details_harness|arc:challenge|25_2023-07-19T19:28:41.797658.parquet"]}, {"split": "2023_08_09T09_59_20.126364", "path": ["**/details_harness|arc:challenge|25_2023-08-09T09:59:20.126364.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-09T09:59:20.126364.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_23T06_53_23.116359", "path": ["**/details_harness|drop|3_2023-09-23T06-53-23.116359.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-23T06-53-23.116359.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_23T06_53_23.116359", "path": ["**/details_harness|gsm8k|5_2023-09-23T06-53-23.116359.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-23T06-53-23.116359.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T19_28_41.797658", "path": ["**/details_harness|hellaswag|10_2023-07-19T19:28:41.797658.parquet"]}, {"split": "2023_08_09T09_59_20.126364", 
"path": ["**/details_harness|hellaswag|10_2023-08-09T09:59:20.126364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-09T09:59:20.126364.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T19_28_41.797658", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:28:41.797658.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:28:41.797658.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:28:41.797658.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:28:41.797658.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:28:41.797658.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:28:41.797658.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:28:41.797658.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:28:41.797658.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:28:41.797658.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:28:41.797658.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:28:41.797658.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:28:41.797658.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:28:41.797658.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:28:41.797658.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:28:41.797658.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:28:41.797658.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:28:41.797658.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:28:41.797658.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:28:41.797658.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:28:41.797658.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:28:41.797658.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:28:41.797658.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:28:41.797658.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:28:41.797658.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:28:41.797658.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:28:41.797658.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:28:41.797658.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:28:41.797658.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:28:41.797658.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:28:41.797658.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:28:41.797658.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:28:41.797658.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:28:41.797658.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:28:41.797658.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:28:41.797658.parquet", 
"**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:28:41.797658.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:28:41.797658.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:28:41.797658.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T19:28:41.797658.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:28:41.797658.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:28:41.797658.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:28:41.797658.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:28:41.797658.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:28:41.797658.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:28:41.797658.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:28:41.797658.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:28:41.797658.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:28:41.797658.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:28:41.797658.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:28:41.797658.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:28:41.797658.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:28:41.797658.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:28:41.797658.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:28:41.797658.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:28:41.797658.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T19:28:41.797658.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:28:41.797658.parquet"]}, {"split": "2023_08_09T09_59_20.126364", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T09:59:20.126364.parquet", 
"**/details_harness|hendrycksTest-global_facts|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T09:59:20.126364.parquet", 
"**/details_harness|hendrycksTest-virology|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T09:59:20.126364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T09:59:20.126364.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T09:59:20.126364.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T09:59:20.126364.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_19T19_28_41.797658", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T19:28:41.797658.parquet"]}, {"split": "2023_08_09T09_59_20.126364", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T09:59:20.126364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T09:59:20.126364.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T19_28_41.797658", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T19:28:41.797658.parquet"]}, {"split": "2023_08_09T09_59_20.126364", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T09:59:20.126364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T09:59:20.126364.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T19_28_41.797658", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T19:28:41.797658.parquet"]}, {"split": "2023_08_09T09_59_20.126364", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T09:59:20.126364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T09:59:20.126364.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T19_28_41.797658", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T19:28:41.797658.parquet"]}, {"split": "2023_08_09T09_59_20.126364", "path": 
["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T09:59:20.126364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T09:59:20.126364.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T19_28_41.797658", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T19:28:41.797658.parquet"]}, {"split": "2023_08_09T09_59_20.126364", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T09:59:20.126364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T09:59:20.126364.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T19_28_41.797658", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T19:28:41.797658.parquet"]}, {"split": "2023_08_09T09_59_20.126364", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T09:59:20.126364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T09:59:20.126364.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T19_28_41.797658", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T19:28:41.797658.parquet"]}, {"split": "2023_08_09T09_59_20.126364", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T09:59:20.126364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T09:59:20.126364.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T19_28_41.797658", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T19:28:41.797658.parquet"]}, {"split": "2023_08_09T09_59_20.126364", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T09:59:20.126364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T09:59:20.126364.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T19_28_41.797658", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T19:28:41.797658.parquet"]}, {"split": "2023_08_09T09_59_20.126364", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T09:59:20.126364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T09:59:20.126364.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T19_28_41.797658", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T19:28:41.797658.parquet"]}, {"split": "2023_08_09T09_59_20.126364", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T09:59:20.126364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T09:59:20.126364.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T19_28_41.797658", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T19:28:41.797658.parquet"]}, {"split": "2023_08_09T09_59_20.126364", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T09:59:20.126364.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T09:59:20.126364.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T19_28_41.797658", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T19:28:41.797658.parquet"]}, {"split": "2023_08_09T09_59_20.126364", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T09:59:20.126364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T09:59:20.126364.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T19_28_41.797658", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T19:28:41.797658.parquet"]}, {"split": "2023_08_09T09_59_20.126364", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T09:59:20.126364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T09:59:20.126364.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T19_28_41.797658", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T19:28:41.797658.parquet"]}, {"split": "2023_08_09T09_59_20.126364", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T09:59:20.126364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T09:59:20.126364.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T19_28_41.797658", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T19:28:41.797658.parquet"]}, {"split": "2023_08_09T09_59_20.126364", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T09:59:20.126364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T09:59:20.126364.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T19_28_41.797658", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T19:28:41.797658.parquet"]}, {"split": "2023_08_09T09_59_20.126364", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T09:59:20.126364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T09:59:20.126364.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T19_28_41.797658", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T19:28:41.797658.parquet"]}, {"split": "2023_08_09T09_59_20.126364", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T09:59:20.126364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T09:59:20.126364.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T19_28_41.797658", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T19:28:41.797658.parquet"]}, {"split": "2023_08_09T09_59_20.126364", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T09:59:20.126364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T09:59:20.126364.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": 
"2023_07_19T19_28_41.797658", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T19:28:41.797658.parquet"]}, {"split": "2023_08_09T09_59_20.126364", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T09:59:20.126364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T09:59:20.126364.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T19_28_41.797658", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T19:28:41.797658.parquet"]}, {"split": "2023_08_09T09_59_20.126364", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T09:59:20.126364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T09:59:20.126364.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T19_28_41.797658", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T19:28:41.797658.parquet"]}, {"split": "2023_08_09T09_59_20.126364", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T09:59:20.126364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T09:59:20.126364.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T19_28_41.797658", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T19:28:41.797658.parquet"]}, {"split": "2023_08_09T09_59_20.126364", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T09:59:20.126364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T09:59:20.126364.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T19_28_41.797658", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T19:28:41.797658.parquet"]}, {"split": "2023_08_09T09_59_20.126364", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T09:59:20.126364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T09:59:20.126364.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T19_28_41.797658", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T19:28:41.797658.parquet"]}, {"split": "2023_08_09T09_59_20.126364", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T09:59:20.126364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T09:59:20.126364.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T19_28_41.797658", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T19:28:41.797658.parquet"]}, {"split": "2023_08_09T09_59_20.126364", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T09:59:20.126364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T09:59:20.126364.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T19_28_41.797658", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T19:28:41.797658.parquet"]}, {"split": "2023_08_09T09_59_20.126364", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T09:59:20.126364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T09:59:20.126364.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T19_28_41.797658", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T19:28:41.797658.parquet"]}, {"split": "2023_08_09T09_59_20.126364", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T09:59:20.126364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T09:59:20.126364.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T19_28_41.797658", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T19:28:41.797658.parquet"]}, {"split": "2023_08_09T09_59_20.126364", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T09:59:20.126364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T09:59:20.126364.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T19_28_41.797658", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T19:28:41.797658.parquet"]}, {"split": "2023_08_09T09_59_20.126364", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T09:59:20.126364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T09:59:20.126364.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T19_28_41.797658", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T19:28:41.797658.parquet"]}, {"split": "2023_08_09T09_59_20.126364", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T09:59:20.126364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T09:59:20.126364.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T19_28_41.797658", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T19:28:41.797658.parquet"]}, {"split": "2023_08_09T09_59_20.126364", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T09:59:20.126364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T09:59:20.126364.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T19_28_41.797658", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T19:28:41.797658.parquet"]}, {"split": "2023_08_09T09_59_20.126364", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T09:59:20.126364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T09:59:20.126364.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T19_28_41.797658", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T19:28:41.797658.parquet"]}, {"split": "2023_08_09T09_59_20.126364", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T09:59:20.126364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T09:59:20.126364.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T19_28_41.797658", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T19:28:41.797658.parquet"]}, {"split": "2023_08_09T09_59_20.126364", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T09:59:20.126364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T09:59:20.126364.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T19_28_41.797658", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T19:28:41.797658.parquet"]}, {"split": "2023_08_09T09_59_20.126364", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T09:59:20.126364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T09:59:20.126364.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T19_28_41.797658", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T19:28:41.797658.parquet"]}, {"split": "2023_08_09T09_59_20.126364", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T09:59:20.126364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T09:59:20.126364.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T19_28_41.797658", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T19:28:41.797658.parquet"]}, {"split": "2023_08_09T09_59_20.126364", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T09:59:20.126364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T09:59:20.126364.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T19_28_41.797658", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T19:28:41.797658.parquet"]}, {"split": "2023_08_09T09_59_20.126364", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T09:59:20.126364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T09:59:20.126364.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T19_28_41.797658", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T19:28:41.797658.parquet"]}, {"split": "2023_08_09T09_59_20.126364", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T09:59:20.126364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T09:59:20.126364.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T19_28_41.797658", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T19:28:41.797658.parquet"]}, {"split": "2023_08_09T09_59_20.126364", "path": 
["**/details_harness|hendrycksTest-marketing|5_2023-08-09T09:59:20.126364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T09:59:20.126364.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T19_28_41.797658", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T19:28:41.797658.parquet"]}, {"split": "2023_08_09T09_59_20.126364", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T09:59:20.126364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T09:59:20.126364.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T19_28_41.797658", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T19:28:41.797658.parquet"]}, {"split": "2023_08_09T09_59_20.126364", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T09:59:20.126364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T09:59:20.126364.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T19_28_41.797658", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T19:28:41.797658.parquet"]}, {"split": "2023_08_09T09_59_20.126364", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T09:59:20.126364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T09:59:20.126364.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T19_28_41.797658", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T19:28:41.797658.parquet"]}, {"split": "2023_08_09T09_59_20.126364", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T09:59:20.126364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T09:59:20.126364.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T19_28_41.797658", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T19:28:41.797658.parquet"]}, {"split": "2023_08_09T09_59_20.126364", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T09:59:20.126364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T09:59:20.126364.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T19_28_41.797658", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T19:28:41.797658.parquet"]}, {"split": "2023_08_09T09_59_20.126364", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T09:59:20.126364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T09:59:20.126364.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T19_28_41.797658", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T19:28:41.797658.parquet"]}, {"split": "2023_08_09T09_59_20.126364", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T09:59:20.126364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T09:59:20.126364.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": 
"2023_07_19T19_28_41.797658", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T19:28:41.797658.parquet"]}, {"split": "2023_08_09T09_59_20.126364", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T09:59:20.126364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T09:59:20.126364.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T19_28_41.797658", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T19:28:41.797658.parquet"]}, {"split": "2023_08_09T09_59_20.126364", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T09:59:20.126364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T09:59:20.126364.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T19_28_41.797658", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T19:28:41.797658.parquet"]}, {"split": "2023_08_09T09_59_20.126364", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T09:59:20.126364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T09:59:20.126364.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T19_28_41.797658", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T19:28:41.797658.parquet"]}, {"split": "2023_08_09T09_59_20.126364", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T09:59:20.126364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T09:59:20.126364.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_19T19_28_41.797658", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T19:28:41.797658.parquet"]}, {"split": "2023_08_09T09_59_20.126364", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T09:59:20.126364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T09:59:20.126364.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T19_28_41.797658", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T19:28:41.797658.parquet"]}, {"split": "2023_08_09T09_59_20.126364", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T09:59:20.126364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T09:59:20.126364.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T19_28_41.797658", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T19:28:41.797658.parquet"]}, {"split": "2023_08_09T09_59_20.126364", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T09:59:20.126364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T09:59:20.126364.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T19_28_41.797658", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T19:28:41.797658.parquet"]}, {"split": 
"2023_08_09T09_59_20.126364", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T09:59:20.126364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T09:59:20.126364.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T19_28_41.797658", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T19:28:41.797658.parquet"]}, {"split": "2023_08_09T09_59_20.126364", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T09:59:20.126364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T09:59:20.126364.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T19_28_41.797658", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T19:28:41.797658.parquet"]}, {"split": "2023_08_09T09_59_20.126364", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T09:59:20.126364.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T09:59:20.126364.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T19_28_41.797658", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T19:28:41.797658.parquet"]}, {"split": "2023_08_09T09_59_20.126364", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T09:59:20.126364.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T09:59:20.126364.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_23T06_53_23.116359", "path": ["**/details_harness|winogrande|5_2023-09-23T06-53-23.116359.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-23T06-53-23.116359.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T19_28_41.797658", "path": ["results_2023-07-19T19:28:41.797658.parquet"]}, {"split": "2023_08_09T09_59_20.126364", "path": ["results_2023-08-09T09:59:20.126364.parquet"]}, {"split": "2023_09_23T06_53_23.116359", "path": ["results_2023-09-23T06-53-23.116359.parquet"]}, {"split": "latest", "path": ["results_2023-09-23T06-53-23.116359.parquet"]}]}]}
|
2023-09-23T05:53:35+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of psmathur/orca_mini_v2_13b
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model psmathur/orca_mini_v2_13b on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
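For example (a minimal sketch; the repository id below is inferred from the `details_<org>__<model>` naming convention used throughout these cards):

```python
from datasets import load_dataset

# "train" always points to the latest results for this task configuration.
data = load_dataset("open-llm-leaderboard/details_psmathur__orca_mini_v2_13b",
    "harness_winogrande_5",
    split="train")
```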
## Latest results
These are the latest results from run 2023-09-23T06:53:23.116359 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of psmathur/orca_mini_v2_13b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model psmathur/orca_mini_v2_13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-23T06:53:23.116359(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of psmathur/orca_mini_v2_13b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model psmathur/orca_mini_v2_13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-23T06:53:23.116359(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
23,
31,
171,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of psmathur/orca_mini_v2_13b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model psmathur/orca_mini_v2_13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-23T06:53:23.116359(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
c4692547d665cc1a9a9565ba0101bcea4bb67c3a
|
# Dataset Card for Evaluation run of psmathur/orca_mini_3b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/psmathur/orca_mini_3b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [psmathur/orca_mini_3b](https://huggingface.co/psmathur/orca_mini_3b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_psmathur__orca_mini_3b",
"harness_winogrande_5",
split="train")
```
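
Each timestamped run is also exposed as its own split, so a particular run can be pinned explicitly instead of relying on "latest"; a sketch using a split name taken from the `configs` metadata of this card:

```python
from datasets import load_dataset

# Pin the 2023-10-17 run explicitly, rather than whichever run
# the "latest" (or "train") split currently points to.
data = load_dataset("open-llm-leaderboard/details_psmathur__orca_mini_3b",
    "harness_winogrande_5",
    split="2023_10_17T23_11_12.481568")
```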
## Latest results
These are the [latest results from run 2023-10-17T23:11:12.481568](https://huggingface.co/datasets/open-llm-leaderboard/details_psmathur__orca_mini_3b/blob/main/results_2023-10-17T23-11-12.481568.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.08525587248322147,
"em_stderr": 0.0028599050719363664,
"f1": 0.14329697986577175,
"f1_stderr": 0.003074790501045602,
"acc": 0.3093767072589133,
"acc_stderr": 0.007206864164846476
},
"harness|drop|3": {
"em": 0.08525587248322147,
"em_stderr": 0.0028599050719363664,
"f1": 0.14329697986577175,
"f1_stderr": 0.003074790501045602
},
"harness|gsm8k|5": {
"acc": 0.000758150113722517,
"acc_stderr": 0.0007581501137225393
},
"harness|winogrande|5": {
"acc": 0.6179952644041041,
"acc_stderr": 0.013655578215970413
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_psmathur__orca_mini_3b
|
[
"region:us"
] |
2023-08-17T23:17:46+00:00
|
{"pretty_name": "Evaluation run of psmathur/orca_mini_3b", "dataset_summary": "Dataset automatically created during the evaluation run of model [psmathur/orca_mini_3b](https://huggingface.co/psmathur/orca_mini_3b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_psmathur__orca_mini_3b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-17T23:11:12.481568](https://huggingface.co/datasets/open-llm-leaderboard/details_psmathur__orca_mini_3b/blob/main/results_2023-10-17T23-11-12.481568.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.08525587248322147,\n \"em_stderr\": 0.0028599050719363664,\n \"f1\": 0.14329697986577175,\n \"f1_stderr\": 0.003074790501045602,\n \"acc\": 0.3093767072589133,\n \"acc_stderr\": 0.007206864164846476\n },\n \"harness|drop|3\": {\n \"em\": 0.08525587248322147,\n \"em_stderr\": 0.0028599050719363664,\n \"f1\": 0.14329697986577175,\n \"f1_stderr\": 0.003074790501045602\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.000758150113722517,\n \"acc_stderr\": 0.0007581501137225393\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6179952644041041,\n \"acc_stderr\": 0.013655578215970413\n }\n}\n```", "repo_url": "https://huggingface.co/psmathur/orca_mini_3b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T14_44_31.628313", "path": ["**/details_harness|arc:challenge|25_2023-07-19T14:44:31.628313.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T14:44:31.628313.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_17T23_11_12.481568", "path": ["**/details_harness|drop|3_2023-10-17T23-11-12.481568.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-17T23-11-12.481568.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_17T23_11_12.481568", "path": ["**/details_harness|gsm8k|5_2023-10-17T23-11-12.481568.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-17T23-11-12.481568.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T14_44_31.628313", "path": ["**/details_harness|hellaswag|10_2023-07-19T14:44:31.628313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T14:44:31.628313.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_07_19T14_44_31.628313", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:44:31.628313.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:44:31.628313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:44:31.628313.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T14:44:31.628313.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:44:31.628313.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T14_44_31.628313", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:44:31.628313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:44:31.628313.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T14_44_31.628313", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:44:31.628313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:44:31.628313.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T14_44_31.628313", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:44:31.628313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:44:31.628313.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T14_44_31.628313", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:44:31.628313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:44:31.628313.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T14_44_31.628313", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:44:31.628313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:44:31.628313.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T14_44_31.628313", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:44:31.628313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:44:31.628313.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T14_44_31.628313", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:44:31.628313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:44:31.628313.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T14_44_31.628313", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:44:31.628313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:44:31.628313.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T14_44_31.628313", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:44:31.628313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:44:31.628313.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T14_44_31.628313", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:44:31.628313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:44:31.628313.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T14_44_31.628313", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:44:31.628313.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:44:31.628313.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T14_44_31.628313", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:44:31.628313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:44:31.628313.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T14_44_31.628313", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:44:31.628313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:44:31.628313.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T14_44_31.628313", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:44:31.628313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:44:31.628313.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T14_44_31.628313", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:44:31.628313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:44:31.628313.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T14_44_31.628313", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:44:31.628313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:44:31.628313.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T14_44_31.628313", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:44:31.628313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:44:31.628313.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T14_44_31.628313", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:44:31.628313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:44:31.628313.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T14_44_31.628313", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:44:31.628313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:44:31.628313.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T14_44_31.628313", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:44:31.628313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:44:31.628313.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T14_44_31.628313", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:44:31.628313.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:44:31.628313.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T14_44_31.628313", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:44:31.628313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:44:31.628313.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T14_44_31.628313", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:44:31.628313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:44:31.628313.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T14_44_31.628313", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:44:31.628313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:44:31.628313.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T14_44_31.628313", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:44:31.628313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:44:31.628313.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T14_44_31.628313", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:44:31.628313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:44:31.628313.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T14_44_31.628313", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:44:31.628313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:44:31.628313.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T14_44_31.628313", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:44:31.628313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:44:31.628313.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T14_44_31.628313", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:44:31.628313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:44:31.628313.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T14_44_31.628313", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:44:31.628313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:44:31.628313.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T14_44_31.628313", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:44:31.628313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:44:31.628313.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T14_44_31.628313", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:44:31.628313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:44:31.628313.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T14_44_31.628313", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:44:31.628313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:44:31.628313.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T14_44_31.628313", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:44:31.628313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:44:31.628313.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T14_44_31.628313", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:44:31.628313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:44:31.628313.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T14_44_31.628313", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:44:31.628313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:44:31.628313.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T14_44_31.628313", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:44:31.628313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:44:31.628313.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T14_44_31.628313", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:44:31.628313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:44:31.628313.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T14_44_31.628313", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T14:44:31.628313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T14:44:31.628313.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T14_44_31.628313", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:44:31.628313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:44:31.628313.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T14_44_31.628313", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:44:31.628313.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:44:31.628313.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T14_44_31.628313", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:44:31.628313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:44:31.628313.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T14_44_31.628313", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:44:31.628313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:44:31.628313.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T14_44_31.628313", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:44:31.628313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:44:31.628313.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T14_44_31.628313", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:44:31.628313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:44:31.628313.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T14_44_31.628313", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:44:31.628313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:44:31.628313.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T14_44_31.628313", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:44:31.628313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:44:31.628313.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T14_44_31.628313", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:44:31.628313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:44:31.628313.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T14_44_31.628313", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:44:31.628313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:44:31.628313.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T14_44_31.628313", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:44:31.628313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:44:31.628313.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T14_44_31.628313", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:44:31.628313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:44:31.628313.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T14_44_31.628313", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:44:31.628313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:44:31.628313.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T14_44_31.628313", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:44:31.628313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:44:31.628313.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T14_44_31.628313", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:44:31.628313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:44:31.628313.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T14_44_31.628313", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:44:31.628313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:44:31.628313.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T14_44_31.628313", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T14:44:31.628313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T14:44:31.628313.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T14_44_31.628313", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:44:31.628313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:44:31.628313.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T14_44_31.628313", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T14:44:31.628313.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T14:44:31.628313.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_17T23_11_12.481568", "path": ["**/details_harness|winogrande|5_2023-10-17T23-11-12.481568.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-17T23-11-12.481568.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T14_44_31.628313", "path": ["results_2023-07-19T14:44:31.628313.parquet"]}, {"split": "2023_10_17T23_11_12.481568", "path": ["results_2023-10-17T23-11-12.481568.parquet"]}, {"split": "latest", "path": ["results_2023-10-17T23-11-12.481568.parquet"]}]}]}
|
2023-10-17T22:11:24+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of psmathur/orca_mini_3b
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model psmathur/orca_mini_3b on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
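For example, using the dataset id from this card's metadata:

```python
from datasets import load_dataset

# "train" always points to the latest results.
data = load_dataset("open-llm-leaderboard/details_psmathur__orca_mini_3b",
    "harness_winogrande_5",
    split="train")
```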
## Latest results
These are the latest results from run 2023-10-17T23:11:12.481568 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of psmathur/orca_mini_3b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model psmathur/orca_mini_3b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-17T23:11:12.481568(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of psmathur/orca_mini_3b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model psmathur/orca_mini_3b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-17T23:11:12.481568(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
20,
31,
168,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of psmathur/orca_mini_3b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model psmathur/orca_mini_3b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-17T23:11:12.481568(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
f27b0569f8a32ba96f97905f8e73d6e88738910d
|
# Dataset Card for Evaluation run of TFLai/llama-13b-4bit-alpaca
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TFLai/llama-13b-4bit-alpaca
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [TFLai/llama-13b-4bit-alpaca](https://huggingface.co/TFLai/llama-13b-4bit-alpaca) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TFLai__llama-13b-4bit-alpaca",
"harness_winogrande_5",
split="train")
```
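
The aggregated numbers shown under "Latest results" are stored in the separate "results" configuration; a sketch of loading them, assuming the config and split names follow the pattern used by the other cards in this document:

```python
from datasets import load_dataset

# The "results" config holds one aggregated results file per run;
# the "latest" split points to the most recent one.
results = load_dataset("open-llm-leaderboard/details_TFLai__llama-13b-4bit-alpaca",
    "results",
    split="latest")
```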
## Latest results
These are the [latest results from run 2023-10-22T09:29:09.659096](https://huggingface.co/datasets/open-llm-leaderboard/details_TFLai__llama-13b-4bit-alpaca/blob/main/results_2023-10-22T09-29-09.659096.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0024119127516778523,
"em_stderr": 0.0005023380498893279,
"f1": 0.05846371644295316,
"f1_stderr": 0.0013421113397736036,
"acc": 0.4025427050341287,
"acc_stderr": 0.009288639671179719
},
"harness|drop|3": {
"em": 0.0024119127516778523,
"em_stderr": 0.0005023380498893279,
"f1": 0.05846371644295316,
"f1_stderr": 0.0013421113397736036
},
"harness|gsm8k|5": {
"acc": 0.05686125852918878,
"acc_stderr": 0.006378790242099664
},
"harness|winogrande|5": {
"acc": 0.7482241515390686,
"acc_stderr": 0.012198489100259774
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_TFLai__llama-13b-4bit-alpaca
|
[
"region:us"
] |
2023-08-17T23:17:55+00:00
|
{"pretty_name": "Evaluation run of TFLai/llama-13b-4bit-alpaca", "dataset_summary": "Dataset automatically created during the evaluation run of model [TFLai/llama-13b-4bit-alpaca](https://huggingface.co/TFLai/llama-13b-4bit-alpaca) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TFLai__llama-13b-4bit-alpaca\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-22T09:29:09.659096](https://huggingface.co/datasets/open-llm-leaderboard/details_TFLai__llama-13b-4bit-alpaca/blob/main/results_2023-10-22T09-29-09.659096.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0024119127516778523,\n \"em_stderr\": 0.0005023380498893279,\n \"f1\": 0.05846371644295316,\n \"f1_stderr\": 0.0013421113397736036,\n \"acc\": 0.4025427050341287,\n \"acc_stderr\": 0.009288639671179719\n },\n \"harness|drop|3\": {\n \"em\": 0.0024119127516778523,\n \"em_stderr\": 0.0005023380498893279,\n \"f1\": 0.05846371644295316,\n \"f1_stderr\": 0.0013421113397736036\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.05686125852918878,\n \"acc_stderr\": 0.006378790242099664\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7482241515390686,\n \"acc_stderr\": 0.012198489100259774\n }\n}\n```", "repo_url": "https://huggingface.co/TFLai/llama-13b-4bit-alpaca", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_08T22_41_19.681002", "path": ["**/details_harness|arc:challenge|25_2023-08-08T22:41:19.681002.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-08T22:41:19.681002.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_22T09_29_09.659096", "path": ["**/details_harness|drop|3_2023-10-22T09-29-09.659096.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-22T09-29-09.659096.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_22T09_29_09.659096", "path": ["**/details_harness|gsm8k|5_2023-10-22T09-29-09.659096.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-22T09-29-09.659096.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_08T22_41_19.681002", "path": ["**/details_harness|hellaswag|10_2023-08-08T22:41:19.681002.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-08T22:41:19.681002.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", 
"data_files": [{"split": "2023_08_08T22_41_19.681002", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-08T22:41:19.681002.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-08T22:41:19.681002.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-08T22:41:19.681002.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-08T22:41:19.681002.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-08T22:41:19.681002.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_08_08T22_41_19.681002", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-08T22:41:19.681002.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-08T22:41:19.681002.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_08T22_41_19.681002", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-08T22:41:19.681002.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-08T22:41:19.681002.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_08T22_41_19.681002", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-08T22:41:19.681002.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-08T22:41:19.681002.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_08T22_41_19.681002", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-08T22:41:19.681002.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-08T22:41:19.681002.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_08T22_41_19.681002", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-08T22:41:19.681002.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-08T22:41:19.681002.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_08T22_41_19.681002", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-08T22:41:19.681002.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-08T22:41:19.681002.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_08T22_41_19.681002", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-08T22:41:19.681002.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-08T22:41:19.681002.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_08T22_41_19.681002", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-08T22:41:19.681002.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-08T22:41:19.681002.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_08T22_41_19.681002", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-08T22:41:19.681002.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-08T22:41:19.681002.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_08T22_41_19.681002", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-08T22:41:19.681002.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-08T22:41:19.681002.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_08T22_41_19.681002", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-08T22:41:19.681002.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-08T22:41:19.681002.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_08T22_41_19.681002", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-08T22:41:19.681002.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-08T22:41:19.681002.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_08T22_41_19.681002", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-08T22:41:19.681002.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-08T22:41:19.681002.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_08T22_41_19.681002", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-08T22:41:19.681002.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-08T22:41:19.681002.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_08T22_41_19.681002", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-08T22:41:19.681002.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-08T22:41:19.681002.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_08T22_41_19.681002", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-08T22:41:19.681002.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-08T22:41:19.681002.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_08T22_41_19.681002", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-08T22:41:19.681002.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-08T22:41:19.681002.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_08T22_41_19.681002", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-08T22:41:19.681002.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-08T22:41:19.681002.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_08T22_41_19.681002", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-08T22:41:19.681002.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-08T22:41:19.681002.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_08T22_41_19.681002", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-08T22:41:19.681002.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-08T22:41:19.681002.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_08T22_41_19.681002", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-08T22:41:19.681002.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-08T22:41:19.681002.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_08T22_41_19.681002", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-08T22:41:19.681002.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-08T22:41:19.681002.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_08T22_41_19.681002", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-08T22:41:19.681002.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-08T22:41:19.681002.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_08T22_41_19.681002", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-08T22:41:19.681002.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-08T22:41:19.681002.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_08T22_41_19.681002", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-08T22:41:19.681002.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-08T22:41:19.681002.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_08T22_41_19.681002", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-08T22:41:19.681002.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-08T22:41:19.681002.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_08T22_41_19.681002", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-08T22:41:19.681002.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-08T22:41:19.681002.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_08T22_41_19.681002", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-08T22:41:19.681002.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-08T22:41:19.681002.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_08T22_41_19.681002", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-08T22:41:19.681002.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-08T22:41:19.681002.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_08T22_41_19.681002", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-08T22:41:19.681002.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-08T22:41:19.681002.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_08T22_41_19.681002", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-08T22:41:19.681002.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-08T22:41:19.681002.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_08T22_41_19.681002", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-08T22:41:19.681002.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-08T22:41:19.681002.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_08T22_41_19.681002", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-08T22:41:19.681002.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-08T22:41:19.681002.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_08T22_41_19.681002", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-08T22:41:19.681002.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-08T22:41:19.681002.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_08T22_41_19.681002", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-08T22:41:19.681002.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-08T22:41:19.681002.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_08T22_41_19.681002", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-08T22:41:19.681002.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-08T22:41:19.681002.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_08T22_41_19.681002", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-08T22:41:19.681002.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-08T22:41:19.681002.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_08T22_41_19.681002", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-08T22:41:19.681002.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-08T22:41:19.681002.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_08T22_41_19.681002", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-08T22:41:19.681002.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-08T22:41:19.681002.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_08T22_41_19.681002", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-08T22:41:19.681002.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-08T22:41:19.681002.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_08T22_41_19.681002", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-08T22:41:19.681002.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-08T22:41:19.681002.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_08T22_41_19.681002", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-08T22:41:19.681002.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-08T22:41:19.681002.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_08T22_41_19.681002", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-08T22:41:19.681002.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-08T22:41:19.681002.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_08T22_41_19.681002", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-08T22:41:19.681002.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-08T22:41:19.681002.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_08T22_41_19.681002", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-08T22:41:19.681002.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-08T22:41:19.681002.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_08T22_41_19.681002", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-08T22:41:19.681002.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-08T22:41:19.681002.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_08T22_41_19.681002", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-08T22:41:19.681002.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-08T22:41:19.681002.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_08T22_41_19.681002", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-08T22:41:19.681002.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-08T22:41:19.681002.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_08T22_41_19.681002", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-08T22:41:19.681002.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-08T22:41:19.681002.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_08T22_41_19.681002", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-08T22:41:19.681002.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-08T22:41:19.681002.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_08T22_41_19.681002", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-08T22:41:19.681002.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-08T22:41:19.681002.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_08_08T22_41_19.681002", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-08T22:41:19.681002.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-08T22:41:19.681002.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_08T22_41_19.681002", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-08T22:41:19.681002.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-08T22:41:19.681002.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_08T22_41_19.681002", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-08T22:41:19.681002.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-08T22:41:19.681002.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_08T22_41_19.681002", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-08T22:41:19.681002.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-08T22:41:19.681002.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_08T22_41_19.681002", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-08T22:41:19.681002.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-08T22:41:19.681002.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_08T22_41_19.681002", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-08T22:41:19.681002.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-08T22:41:19.681002.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_08T22_41_19.681002", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-08T22:41:19.681002.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-08T22:41:19.681002.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_22T09_29_09.659096", "path": ["**/details_harness|winogrande|5_2023-10-22T09-29-09.659096.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-22T09-29-09.659096.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_08T22_41_19.681002", "path": ["results_2023-08-08T22:41:19.681002.parquet"]}, {"split": "2023_10_22T09_29_09.659096", "path": ["results_2023-10-22T09-29-09.659096.parquet"]}, {"split": "latest", "path": ["results_2023-10-22T09-29-09.659096.parquet"]}]}]}
|
2023-10-22T08:29:21+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of TFLai/llama-13b-4bit-alpaca
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model TFLai/llama-13b-4bit-alpaca on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-10-22T09:29:09.659096 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of TFLai/llama-13b-4bit-alpaca",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TFLai/llama-13b-4bit-alpaca on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-22T09:29:09.659096(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of TFLai/llama-13b-4bit-alpaca",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TFLai/llama-13b-4bit-alpaca on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-22T09:29:09.659096(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
22,
31,
170,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of TFLai/llama-13b-4bit-alpaca## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model TFLai/llama-13b-4bit-alpaca on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-22T09:29:09.659096(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
a0f7f5c121d15c81e9c3dfe4cdd20699085a15b0
|
# Dataset Card for Evaluation run of TFLai/gpt2-turkish-uncased
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TFLai/gpt2-turkish-uncased
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [TFLai/gpt2-turkish-uncased](https://huggingface.co/TFLai/gpt2-turkish-uncased) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TFLai__gpt2-turkish-uncased",
"harness_gsm8k_5",
split="train")
```
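Every run is also available under its timestamped split, so you can pin an analysis to one specific evaluation rather than the moving "train"/"latest" alias. A minimal sketch, with the split name taken from the configs listed in this card's metadata:
```python
from datasets import load_dataset

# Load one specific gsm8k run by its timestamped split name.
run = load_dataset("open-llm-leaderboard/details_TFLai__gpt2-turkish-uncased",
	"harness_gsm8k_5",
	split="2023_12_02T15_29_40.186292")
```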
## Latest results
These are the [latest results from run 2023-12-02T15:29:40.186292](https://huggingface.co/datasets/open-llm-leaderboard/details_TFLai__gpt2-turkish-uncased/blob/main/results_2023-12-02T15-29-40.186292.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_TFLai__gpt2-turkish-uncased
|
[
"region:us"
] |
2023-08-17T23:18:04+00:00
|
{"pretty_name": "Evaluation run of TFLai/gpt2-turkish-uncased", "dataset_summary": "Dataset automatically created during the evaluation run of model [TFLai/gpt2-turkish-uncased](https://huggingface.co/TFLai/gpt2-turkish-uncased) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TFLai__gpt2-turkish-uncased\",\n\t\"harness_gsm8k_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-02T15:29:40.186292](https://huggingface.co/datasets/open-llm-leaderboard/details_TFLai__gpt2-turkish-uncased/blob/main/results_2023-12-02T15-29-40.186292.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": "https://huggingface.co/TFLai/gpt2-turkish-uncased", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_24T09_48_46.264649", "path": ["**/details_harness|arc:challenge|25_2023-07-24T09:48:46.264649.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-24T09:48:46.264649.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_22T01_34_05.823968", "path": ["**/details_harness|drop|3_2023-10-22T01-34-05.823968.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-22T01-34-05.823968.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_22T01_34_05.823968", "path": ["**/details_harness|gsm8k|5_2023-10-22T01-34-05.823968.parquet"]}, {"split": "2023_12_02T15_29_40.186292", "path": ["**/details_harness|gsm8k|5_2023-12-02T15-29-40.186292.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-02T15-29-40.186292.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_24T09_48_46.264649", "path": ["**/details_harness|hellaswag|10_2023-07-24T09:48:46.264649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-24T09:48:46.264649.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_24T09_48_46.264649", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T09:48:46.264649.parquet", 
"**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T09:48:46.264649.parquet", 
"**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-24T09:48:46.264649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T09:48:46.264649.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-24T09:48:46.264649.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-24T09:48:46.264649.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_24T09_48_46.264649", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T09:48:46.264649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-24T09:48:46.264649.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_24T09_48_46.264649", "path": 
["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T09:48:46.264649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-24T09:48:46.264649.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_24T09_48_46.264649", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T09:48:46.264649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-24T09:48:46.264649.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_24T09_48_46.264649", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T09:48:46.264649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-24T09:48:46.264649.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_24T09_48_46.264649", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T09:48:46.264649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-24T09:48:46.264649.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_24T09_48_46.264649", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T09:48:46.264649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-24T09:48:46.264649.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_24T09_48_46.264649", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T09:48:46.264649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-24T09:48:46.264649.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_24T09_48_46.264649", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T09:48:46.264649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-24T09:48:46.264649.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_24T09_48_46.264649", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T09:48:46.264649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-24T09:48:46.264649.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_24T09_48_46.264649", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T09:48:46.264649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-24T09:48:46.264649.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_24T09_48_46.264649", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T09:48:46.264649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-24T09:48:46.264649.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_24T09_48_46.264649", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T09:48:46.264649.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-computer_security|5_2023-07-24T09:48:46.264649.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_24T09_48_46.264649", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T09:48:46.264649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-24T09:48:46.264649.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_24T09_48_46.264649", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T09:48:46.264649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-24T09:48:46.264649.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_24T09_48_46.264649", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T09:48:46.264649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-24T09:48:46.264649.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_24T09_48_46.264649", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T09:48:46.264649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-24T09:48:46.264649.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_24T09_48_46.264649", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T09:48:46.264649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-24T09:48:46.264649.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_24T09_48_46.264649", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T09:48:46.264649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-24T09:48:46.264649.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_24T09_48_46.264649", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T09:48:46.264649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-24T09:48:46.264649.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_24T09_48_46.264649", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T09:48:46.264649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-24T09:48:46.264649.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_24T09_48_46.264649", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T09:48:46.264649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-24T09:48:46.264649.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_24T09_48_46.264649", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T09:48:46.264649.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-24T09:48:46.264649.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_24T09_48_46.264649", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T09:48:46.264649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-24T09:48:46.264649.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_24T09_48_46.264649", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T09:48:46.264649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-24T09:48:46.264649.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_24T09_48_46.264649", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T09:48:46.264649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-24T09:48:46.264649.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_24T09_48_46.264649", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T09:48:46.264649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-24T09:48:46.264649.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_24T09_48_46.264649", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T09:48:46.264649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-24T09:48:46.264649.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_24T09_48_46.264649", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T09:48:46.264649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-24T09:48:46.264649.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_24T09_48_46.264649", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T09:48:46.264649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-24T09:48:46.264649.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_24T09_48_46.264649", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T09:48:46.264649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-24T09:48:46.264649.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_24T09_48_46.264649", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T09:48:46.264649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-24T09:48:46.264649.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_24T09_48_46.264649", "path": 
["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T09:48:46.264649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-24T09:48:46.264649.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_24T09_48_46.264649", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T09:48:46.264649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-24T09:48:46.264649.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_24T09_48_46.264649", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T09:48:46.264649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-24T09:48:46.264649.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_24T09_48_46.264649", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T09:48:46.264649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-24T09:48:46.264649.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_24T09_48_46.264649", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T09:48:46.264649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-24T09:48:46.264649.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_24T09_48_46.264649", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T09:48:46.264649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-24T09:48:46.264649.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_24T09_48_46.264649", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T09:48:46.264649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-24T09:48:46.264649.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_24T09_48_46.264649", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T09:48:46.264649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-24T09:48:46.264649.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_24T09_48_46.264649", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T09:48:46.264649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-24T09:48:46.264649.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_24T09_48_46.264649", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T09:48:46.264649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-24T09:48:46.264649.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_24T09_48_46.264649", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T09:48:46.264649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-24T09:48:46.264649.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_24T09_48_46.264649", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T09:48:46.264649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-24T09:48:46.264649.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_24T09_48_46.264649", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T09:48:46.264649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-24T09:48:46.264649.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_24T09_48_46.264649", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T09:48:46.264649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-24T09:48:46.264649.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_24T09_48_46.264649", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T09:48:46.264649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-24T09:48:46.264649.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_24T09_48_46.264649", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T09:48:46.264649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-24T09:48:46.264649.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_24T09_48_46.264649", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T09:48:46.264649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-24T09:48:46.264649.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_24T09_48_46.264649", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T09:48:46.264649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-24T09:48:46.264649.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_24T09_48_46.264649", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T09:48:46.264649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-24T09:48:46.264649.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_24T09_48_46.264649", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T09:48:46.264649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-24T09:48:46.264649.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_24T09_48_46.264649", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T09:48:46.264649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-24T09:48:46.264649.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_24T09_48_46.264649", "path": 
["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T09:48:46.264649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-24T09:48:46.264649.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_24T09_48_46.264649", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T09:48:46.264649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-24T09:48:46.264649.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_24T09_48_46.264649", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T09:48:46.264649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-24T09:48:46.264649.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_24T09_48_46.264649", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T09:48:46.264649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-24T09:48:46.264649.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_24T09_48_46.264649", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T09:48:46.264649.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-24T09:48:46.264649.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_24T09_48_46.264649", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T09:48:46.264649.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-24T09:48:46.264649.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_22T01_34_05.823968", "path": ["**/details_harness|winogrande|5_2023-10-22T01-34-05.823968.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-22T01-34-05.823968.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_24T09_48_46.264649", "path": ["results_2023-07-24T09:48:46.264649.parquet"]}, {"split": "2023_10_22T01_34_05.823968", "path": ["results_2023-10-22T01-34-05.823968.parquet"]}, {"split": "2023_12_02T15_29_40.186292", "path": ["results_2023-12-02T15-29-40.186292.parquet"]}, {"split": "latest", "path": ["results_2023-12-02T15-29-40.186292.parquet"]}]}]}
|
2023-12-02T15:29:53+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of TFLai/gpt2-turkish-uncased
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TFLai/gpt2-turkish-uncased
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [TFLai/gpt2-turkish-uncased](https://huggingface.co/TFLai/gpt2-turkish-uncased) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
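A minimal sketch (the `harness_winogrande_5` configuration appears in this dataset's metadata above; the repository id is assumed to follow the `details_<org>__<model>` naming convention used throughout these evaluation datasets):

```python
from datasets import load_dataset

# Repository id assumed from the details_<org>__<model> convention.
data = load_dataset("open-llm-leaderboard/details_TFLai__gpt2-turkish-uncased",
	"harness_winogrande_5",
	split="train")
```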
## Latest results
These are the latest results from run 2023-12-02T15:29:40.186292 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
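The raw metric values for this run are not reproduced here; they live in the "results" configuration. A minimal sketch of retrieving them (same assumed repository id as above):

```python
from datasets import load_dataset

# "results" stores the aggregated metrics; the "latest" split points
# to the most recent run (here 2023-12-02T15:29:40.186292).
results = load_dataset("open-llm-leaderboard/details_TFLai__gpt2-turkish-uncased",
	"results",
	split="latest")
print(results[0])
```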
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of TFLai/gpt2-turkish-uncased",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TFLai/gpt2-turkish-uncased on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-02T15:29:40.186292(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of TFLai/gpt2-turkish-uncased",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TFLai/gpt2-turkish-uncased on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-02T15:29:40.186292(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
22,
31,
171,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of TFLai/gpt2-turkish-uncased## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model TFLai/gpt2-turkish-uncased on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-02T15:29:40.186292(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
b5ab8b2007bbeec6fd82fa88ba8bfcfa76e8b9cd
|
# Dataset Card for Evaluation run of TFLai/llama-7b-4bit-alpaca
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TFLai/llama-7b-4bit-alpaca
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [TFLai/llama-7b-4bit-alpaca](https://huggingface.co/TFLai/llama-7b-4bit-alpaca) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TFLai__llama-7b-4bit-alpaca",
"harness_winogrande_5",
split="train")
```
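Each of the 64 evaluated tasks is exposed as its own configuration; if needed, the available configurations can be enumerated with the standard `datasets` helper (a small sketch, not specific to this card):

```python
from datasets import get_dataset_config_names

# One configuration per evaluated task, plus the aggregated "results" config.
configs = get_dataset_config_names("open-llm-leaderboard/details_TFLai__llama-7b-4bit-alpaca")
print(len(configs), configs[:5])
```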
## Latest results
These are the [latest results from run 2023-09-17T15:51:21.649052](https://huggingface.co/datasets/open-llm-leaderboard/details_TFLai__llama-7b-4bit-alpaca/blob/main/results_2023-09-17T15-51-21.649052.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0010486577181208054,
"em_stderr": 0.00033145814652191393,
"f1": 0.05702286073825514,
"f1_stderr": 0.0013031105885826732,
"acc": 0.3718023208847917,
"acc_stderr": 0.008942653172749102
},
"harness|drop|3": {
"em": 0.0010486577181208054,
"em_stderr": 0.00033145814652191393,
"f1": 0.05702286073825514,
"f1_stderr": 0.0013031105885826732
},
"harness|gsm8k|5": {
"acc": 0.0356330553449583,
"acc_stderr": 0.005106107853744191
},
"harness|winogrande|5": {
"acc": 0.7079715864246251,
"acc_stderr": 0.012779198491754013
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_TFLai__llama-7b-4bit-alpaca
|
[
"region:us"
] |
2023-08-17T23:18:13+00:00
|
{"pretty_name": "Evaluation run of TFLai/llama-7b-4bit-alpaca", "dataset_summary": "Dataset automatically created during the evaluation run of model [TFLai/llama-7b-4bit-alpaca](https://huggingface.co/TFLai/llama-7b-4bit-alpaca) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TFLai__llama-7b-4bit-alpaca\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-17T15:51:21.649052](https://huggingface.co/datasets/open-llm-leaderboard/details_TFLai__llama-7b-4bit-alpaca/blob/main/results_2023-09-17T15-51-21.649052.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0010486577181208054,\n \"em_stderr\": 0.00033145814652191393,\n \"f1\": 0.05702286073825514,\n \"f1_stderr\": 0.0013031105885826732,\n \"acc\": 0.3718023208847917,\n \"acc_stderr\": 0.008942653172749102\n },\n \"harness|drop|3\": {\n \"em\": 0.0010486577181208054,\n \"em_stderr\": 0.00033145814652191393,\n \"f1\": 0.05702286073825514,\n \"f1_stderr\": 0.0013031105885826732\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0356330553449583,\n \"acc_stderr\": 0.005106107853744191\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7079715864246251,\n \"acc_stderr\": 0.012779198491754013\n }\n}\n```", "repo_url": "https://huggingface.co/TFLai/llama-7b-4bit-alpaca", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_09T19_29_56.361922", "path": ["**/details_harness|arc:challenge|25_2023-08-09T19:29:56.361922.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-09T19:29:56.361922.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_17T15_51_21.649052", "path": ["**/details_harness|drop|3_2023-09-17T15-51-21.649052.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-17T15-51-21.649052.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_17T15_51_21.649052", "path": ["**/details_harness|gsm8k|5_2023-09-17T15-51-21.649052.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-17T15-51-21.649052.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_09T19_29_56.361922", "path": ["**/details_harness|hellaswag|10_2023-08-09T19:29:56.361922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-09T19:29:56.361922.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", 
"data_files": [{"split": "2023_08_09T19_29_56.361922", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T19:29:56.361922.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T19:29:56.361922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T19:29:56.361922.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T19:29:56.361922.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T19:29:56.361922.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_08_09T19_29_56.361922", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T19:29:56.361922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T19:29:56.361922.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_09T19_29_56.361922", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T19:29:56.361922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T19:29:56.361922.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_09T19_29_56.361922", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T19:29:56.361922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T19:29:56.361922.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_09T19_29_56.361922", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T19:29:56.361922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T19:29:56.361922.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_09T19_29_56.361922", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T19:29:56.361922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T19:29:56.361922.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_09T19_29_56.361922", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T19:29:56.361922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T19:29:56.361922.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_09T19_29_56.361922", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T19:29:56.361922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T19:29:56.361922.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_09T19_29_56.361922", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T19:29:56.361922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T19:29:56.361922.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_09T19_29_56.361922", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T19:29:56.361922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T19:29:56.361922.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_09T19_29_56.361922", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T19:29:56.361922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T19:29:56.361922.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_09T19_29_56.361922", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T19:29:56.361922.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T19:29:56.361922.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_09T19_29_56.361922", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T19:29:56.361922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T19:29:56.361922.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_09T19_29_56.361922", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T19:29:56.361922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T19:29:56.361922.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_09T19_29_56.361922", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T19:29:56.361922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T19:29:56.361922.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_09T19_29_56.361922", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T19:29:56.361922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T19:29:56.361922.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_09T19_29_56.361922", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T19:29:56.361922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T19:29:56.361922.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_09T19_29_56.361922", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T19:29:56.361922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T19:29:56.361922.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_09T19_29_56.361922", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T19:29:56.361922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T19:29:56.361922.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_09T19_29_56.361922", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T19:29:56.361922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T19:29:56.361922.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_09T19_29_56.361922", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T19:29:56.361922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T19:29:56.361922.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_09T19_29_56.361922", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T19:29:56.361922.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T19:29:56.361922.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_09T19_29_56.361922", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T19:29:56.361922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T19:29:56.361922.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_09T19_29_56.361922", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T19:29:56.361922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T19:29:56.361922.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_09T19_29_56.361922", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T19:29:56.361922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T19:29:56.361922.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_09T19_29_56.361922", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T19:29:56.361922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T19:29:56.361922.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_09T19_29_56.361922", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T19:29:56.361922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T19:29:56.361922.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_09T19_29_56.361922", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T19:29:56.361922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T19:29:56.361922.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_09T19_29_56.361922", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T19:29:56.361922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T19:29:56.361922.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_09T19_29_56.361922", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T19:29:56.361922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T19:29:56.361922.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_09T19_29_56.361922", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T19:29:56.361922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T19:29:56.361922.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_09T19_29_56.361922", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T19:29:56.361922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T19:29:56.361922.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_09T19_29_56.361922", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T19:29:56.361922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T19:29:56.361922.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_09T19_29_56.361922", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T19:29:56.361922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T19:29:56.361922.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_09T19_29_56.361922", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T19:29:56.361922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T19:29:56.361922.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_09T19_29_56.361922", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T19:29:56.361922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T19:29:56.361922.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_09T19_29_56.361922", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T19:29:56.361922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T19:29:56.361922.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_09T19_29_56.361922", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T19:29:56.361922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T19:29:56.361922.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_09T19_29_56.361922", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T19:29:56.361922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T19:29:56.361922.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_09T19_29_56.361922", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T19:29:56.361922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T19:29:56.361922.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_09T19_29_56.361922", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T19:29:56.361922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T19:29:56.361922.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_09T19_29_56.361922", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T19:29:56.361922.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T19:29:56.361922.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_09T19_29_56.361922", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T19:29:56.361922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T19:29:56.361922.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_09T19_29_56.361922", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T19:29:56.361922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T19:29:56.361922.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_09T19_29_56.361922", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T19:29:56.361922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T19:29:56.361922.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_09T19_29_56.361922", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T19:29:56.361922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T19:29:56.361922.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_09T19_29_56.361922", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T19:29:56.361922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T19:29:56.361922.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_09T19_29_56.361922", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T19:29:56.361922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T19:29:56.361922.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_09T19_29_56.361922", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T19:29:56.361922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T19:29:56.361922.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_09T19_29_56.361922", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T19:29:56.361922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T19:29:56.361922.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_09T19_29_56.361922", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T19:29:56.361922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T19:29:56.361922.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_09T19_29_56.361922", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T19:29:56.361922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T19:29:56.361922.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_08_09T19_29_56.361922", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T19:29:56.361922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T19:29:56.361922.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_09T19_29_56.361922", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T19:29:56.361922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T19:29:56.361922.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_09T19_29_56.361922", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T19:29:56.361922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T19:29:56.361922.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_09T19_29_56.361922", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T19:29:56.361922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T19:29:56.361922.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_09T19_29_56.361922", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T19:29:56.361922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T19:29:56.361922.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_09T19_29_56.361922", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T19:29:56.361922.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T19:29:56.361922.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_09T19_29_56.361922", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T19:29:56.361922.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T19:29:56.361922.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_17T15_51_21.649052", "path": ["**/details_harness|winogrande|5_2023-09-17T15-51-21.649052.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-17T15-51-21.649052.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_09T19_29_56.361922", "path": ["results_2023-08-09T19:29:56.361922.parquet"]}, {"split": "2023_09_17T15_51_21.649052", "path": ["results_2023-09-17T15-51-21.649052.parquet"]}, {"split": "latest", "path": ["results_2023-09-17T15-51-21.649052.parquet"]}]}]}
|
2023-09-17T14:51:33+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of TFLai/llama-7b-4bit-alpaca
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model TFLai/llama-7b-4bit-alpaca on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
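A minimal sketch, following the pattern of the other evaluation cards in this dump (the repository id `open-llm-leaderboard/details_TFLai__llama-7b-4bit-alpaca` is assumed from their naming convention):
```python
from datasets import load_dataset

# Repository id assumed from the open-llm-leaderboard naming convention;
# "harness_winogrande_5" is one of the configurations listed in the metadata.
data = load_dataset("open-llm-leaderboard/details_TFLai__llama-7b-4bit-alpaca",
	"harness_winogrande_5",
	split="train")
```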
## Latest results
These are the latest results from run 2023-09-17T15:51:21.649052 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of TFLai/llama-7b-4bit-alpaca",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TFLai/llama-7b-4bit-alpaca on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-17T15:51:21.649052(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of TFLai/llama-7b-4bit-alpaca",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TFLai/llama-7b-4bit-alpaca on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-17T15:51:21.649052(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
22,
31,
170,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of TFLai/llama-7b-4bit-alpaca## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model TFLai/llama-7b-4bit-alpaca on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-17T15:51:21.649052(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
8b58ef86091dea29aaabb15ae8044d66cacb717d
|
# Dataset Card for Evaluation run of TFLai/gpt-neox-20b-4bit-alpaca
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TFLai/gpt-neox-20b-4bit-alpaca
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [TFLai/gpt-neox-20b-4bit-alpaca](https://huggingface.co/TFLai/gpt-neox-20b-4bit-alpaca) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TFLai__gpt-neox-20b-4bit-alpaca",
"harness_winogrande_5",
split="train")
```
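Likewise, the aggregated metrics can be inspected through the "results" configuration; a minimal sketch (per the repository metadata, the "latest" split always points at the most recent aggregated results):
```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics of each run;
# the "latest" split points at the most recent one.
results = load_dataset("open-llm-leaderboard/details_TFLai__gpt-neox-20b-4bit-alpaca",
	"results",
	split="latest")
```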
## Latest results
These are the [latest results from run 2023-10-29T00:29:49.222050](https://huggingface.co/datasets/open-llm-leaderboard/details_TFLai__gpt-neox-20b-4bit-alpaca/blob/main/results_2023-10-29T00-29-49.222050.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.003355704697986577,
"em_stderr": 0.0005922452850005421,
"f1": 0.056341233221476654,
"f1_stderr": 0.0014240670819378977,
"acc": 0.33007444471637587,
"acc_stderr": 0.008168323784666929
},
"harness|drop|3": {
"em": 0.003355704697986577,
"em_stderr": 0.0005922452850005421,
"f1": 0.056341233221476654,
"f1_stderr": 0.0014240670819378977
},
"harness|gsm8k|5": {
"acc": 0.011372251705837756,
"acc_stderr": 0.0029206661987887265
},
"harness|winogrande|5": {
"acc": 0.648776637726914,
"acc_stderr": 0.013415981370545131
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_TFLai__gpt-neox-20b-4bit-alpaca
|
[
"region:us"
] |
2023-08-17T23:18:22+00:00
|
{"pretty_name": "Evaluation run of TFLai/gpt-neox-20b-4bit-alpaca", "dataset_summary": "Dataset automatically created during the evaluation run of model [TFLai/gpt-neox-20b-4bit-alpaca](https://huggingface.co/TFLai/gpt-neox-20b-4bit-alpaca) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TFLai__gpt-neox-20b-4bit-alpaca\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-29T00:29:49.222050](https://huggingface.co/datasets/open-llm-leaderboard/details_TFLai__gpt-neox-20b-4bit-alpaca/blob/main/results_2023-10-29T00-29-49.222050.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.003355704697986577,\n \"em_stderr\": 0.0005922452850005421,\n \"f1\": 0.056341233221476654,\n \"f1_stderr\": 0.0014240670819378977,\n \"acc\": 0.33007444471637587,\n \"acc_stderr\": 0.008168323784666929\n },\n \"harness|drop|3\": {\n \"em\": 0.003355704697986577,\n \"em_stderr\": 0.0005922452850005421,\n \"f1\": 0.056341233221476654,\n \"f1_stderr\": 0.0014240670819378977\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.011372251705837756,\n \"acc_stderr\": 0.0029206661987887265\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.648776637726914,\n \"acc_stderr\": 0.013415981370545131\n }\n}\n```", "repo_url": "https://huggingface.co/TFLai/gpt-neox-20b-4bit-alpaca", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_10T02_03_16.907594", "path": ["**/details_harness|arc:challenge|25_2023-08-10T02:03:16.907594.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-10T02:03:16.907594.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_29T00_29_49.222050", "path": ["**/details_harness|drop|3_2023-10-29T00-29-49.222050.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-29T00-29-49.222050.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_29T00_29_49.222050", "path": ["**/details_harness|gsm8k|5_2023-10-29T00-29-49.222050.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-29T00-29-49.222050.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_10T02_03_16.907594", "path": ["**/details_harness|hellaswag|10_2023-08-10T02:03:16.907594.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-10T02:03:16.907594.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_5", "data_files": [{"split": "2023_08_10T02_03_16.907594", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-10T02:03:16.907594.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-10T02:03:16.907594.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-10T02:03:16.907594.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-10T02:03:16.907594.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-10T02:03:16.907594.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_08_10T02_03_16.907594", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-10T02:03:16.907594.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-10T02:03:16.907594.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_10T02_03_16.907594", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-10T02:03:16.907594.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-10T02:03:16.907594.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_10T02_03_16.907594", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-10T02:03:16.907594.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-10T02:03:16.907594.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_10T02_03_16.907594", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-10T02:03:16.907594.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-10T02:03:16.907594.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_10T02_03_16.907594", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-10T02:03:16.907594.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-10T02:03:16.907594.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_10T02_03_16.907594", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-10T02:03:16.907594.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-10T02:03:16.907594.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_10T02_03_16.907594", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-10T02:03:16.907594.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-10T02:03:16.907594.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_10T02_03_16.907594", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-10T02:03:16.907594.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-10T02:03:16.907594.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_10T02_03_16.907594", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-10T02:03:16.907594.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-10T02:03:16.907594.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_10T02_03_16.907594", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-10T02:03:16.907594.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-10T02:03:16.907594.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_10T02_03_16.907594", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-10T02:03:16.907594.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-10T02:03:16.907594.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_10T02_03_16.907594", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-10T02:03:16.907594.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-10T02:03:16.907594.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_10T02_03_16.907594", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-10T02:03:16.907594.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-10T02:03:16.907594.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_10T02_03_16.907594", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-10T02:03:16.907594.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-10T02:03:16.907594.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_10T02_03_16.907594", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-10T02:03:16.907594.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-10T02:03:16.907594.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_10T02_03_16.907594", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-10T02:03:16.907594.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-10T02:03:16.907594.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_10T02_03_16.907594", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-10T02:03:16.907594.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-10T02:03:16.907594.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_10T02_03_16.907594", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-10T02:03:16.907594.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-10T02:03:16.907594.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_10T02_03_16.907594", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-10T02:03:16.907594.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-10T02:03:16.907594.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_10T02_03_16.907594", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-10T02:03:16.907594.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-10T02:03:16.907594.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_10T02_03_16.907594", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-10T02:03:16.907594.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-10T02:03:16.907594.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_10T02_03_16.907594", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-10T02:03:16.907594.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-10T02:03:16.907594.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_10T02_03_16.907594", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-10T02:03:16.907594.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-10T02:03:16.907594.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_10T02_03_16.907594", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-10T02:03:16.907594.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-10T02:03:16.907594.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_10T02_03_16.907594", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-10T02:03:16.907594.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-10T02:03:16.907594.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_10T02_03_16.907594", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-10T02:03:16.907594.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-10T02:03:16.907594.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_10T02_03_16.907594", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-10T02:03:16.907594.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-10T02:03:16.907594.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_10T02_03_16.907594", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-10T02:03:16.907594.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-10T02:03:16.907594.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_10T02_03_16.907594", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-10T02:03:16.907594.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-10T02:03:16.907594.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_10T02_03_16.907594", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-10T02:03:16.907594.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-10T02:03:16.907594.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_10T02_03_16.907594", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-10T02:03:16.907594.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-10T02:03:16.907594.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_10T02_03_16.907594", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-10T02:03:16.907594.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-10T02:03:16.907594.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_10T02_03_16.907594", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-10T02:03:16.907594.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-10T02:03:16.907594.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_10T02_03_16.907594", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-10T02:03:16.907594.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-10T02:03:16.907594.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_10T02_03_16.907594", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-10T02:03:16.907594.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-10T02:03:16.907594.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_10T02_03_16.907594", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-10T02:03:16.907594.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-10T02:03:16.907594.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_10T02_03_16.907594", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-10T02:03:16.907594.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-10T02:03:16.907594.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_10T02_03_16.907594", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-10T02:03:16.907594.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-10T02:03:16.907594.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_10T02_03_16.907594", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-10T02:03:16.907594.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-10T02:03:16.907594.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_10T02_03_16.907594", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-10T02:03:16.907594.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-10T02:03:16.907594.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_10T02_03_16.907594", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-10T02:03:16.907594.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-10T02:03:16.907594.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_10T02_03_16.907594", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-10T02:03:16.907594.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-10T02:03:16.907594.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_10T02_03_16.907594", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-10T02:03:16.907594.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-10T02:03:16.907594.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_10T02_03_16.907594", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-10T02:03:16.907594.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-10T02:03:16.907594.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_10T02_03_16.907594", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-10T02:03:16.907594.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-10T02:03:16.907594.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_10T02_03_16.907594", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-10T02:03:16.907594.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-10T02:03:16.907594.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_10T02_03_16.907594", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-10T02:03:16.907594.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-10T02:03:16.907594.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_10T02_03_16.907594", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-10T02:03:16.907594.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-10T02:03:16.907594.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_10T02_03_16.907594", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-10T02:03:16.907594.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-10T02:03:16.907594.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_10T02_03_16.907594", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-10T02:03:16.907594.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-10T02:03:16.907594.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_10T02_03_16.907594", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-10T02:03:16.907594.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-10T02:03:16.907594.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_08_10T02_03_16.907594", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-10T02:03:16.907594.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-10T02:03:16.907594.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_10T02_03_16.907594", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-10T02:03:16.907594.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-10T02:03:16.907594.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_10T02_03_16.907594", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-10T02:03:16.907594.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-10T02:03:16.907594.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_10T02_03_16.907594", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-10T02:03:16.907594.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-10T02:03:16.907594.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_10T02_03_16.907594", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-10T02:03:16.907594.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-10T02:03:16.907594.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_10T02_03_16.907594", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-10T02:03:16.907594.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-10T02:03:16.907594.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_10T02_03_16.907594", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-10T02:03:16.907594.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-10T02:03:16.907594.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_29T00_29_49.222050", "path": ["**/details_harness|winogrande|5_2023-10-29T00-29-49.222050.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-29T00-29-49.222050.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_10T02_03_16.907594", "path": ["results_2023-08-10T02:03:16.907594.parquet"]}, {"split": "2023_10_29T00_29_49.222050", "path": ["results_2023-10-29T00-29-49.222050.parquet"]}, {"split": "latest", "path": ["results_2023-10-29T00-29-49.222050.parquet"]}]}]}
|
2023-10-28T23:30:01+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of TFLai/gpt-neox-20b-4bit-alpaca
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model TFLai/gpt-neox-20b-4bit-alpaca on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
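As elsewhere, a minimal sketch using the repository id shown above for this card:
```python
from datasets import load_dataset

# "harness_winogrande_5" is one of the configurations listed in the metadata.
data = load_dataset("open-llm-leaderboard/details_TFLai__gpt-neox-20b-4bit-alpaca",
	"harness_winogrande_5",
	split="train")
```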
## Latest results
These are the latest results from run 2023-10-29T00:29:49.222050 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of TFLai/gpt-neox-20b-4bit-alpaca",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TFLai/gpt-neox-20b-4bit-alpaca on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-29T00:29:49.222050(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of TFLai/gpt-neox-20b-4bit-alpaca",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TFLai/gpt-neox-20b-4bit-alpaca on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-29T00:29:49.222050(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
25,
31,
173,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of TFLai/gpt-neox-20b-4bit-alpaca## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model TFLai/gpt-neox-20b-4bit-alpaca on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-29T00:29:49.222050(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
d5621747a4ad33e6c24bdd3af5c16cbbbbb98c6e
|
# Dataset Card for Evaluation run of TFLai/llama-2-13b-4bit-alpaca-gpt4
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TFLai/llama-2-13b-4bit-alpaca-gpt4
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [TFLai/llama-2-13b-4bit-alpaca-gpt4](https://huggingface.co/TFLai/llama-2-13b-4bit-alpaca-gpt4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TFLai__llama-2-13b-4bit-alpaca-gpt4",
"harness_winogrande_5",
split="train")
```
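The aggregated "results" configuration mentioned above can be loaded the same way; a short sketch (the "results" config and its "latest" split are listed in this repository's configuration metadata):
```python
from datasets import load_dataset

# "latest" always points to the most recent aggregated results file.
results = load_dataset("open-llm-leaderboard/details_TFLai__llama-2-13b-4bit-alpaca-gpt4",
	"results",
	split="latest")
```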
## Latest results
These are the [latest results from run 2023-09-23T12:02:28.224092](https://huggingface.co/datasets/open-llm-leaderboard/details_TFLai__llama-2-13b-4bit-alpaca-gpt4/blob/main/results_2023-09-23T12-02-28.224092.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0028313758389261743,
"em_stderr": 0.0005441551135494192,
"f1": 0.06384752516778533,
"f1_stderr": 0.0014352043722984558,
"acc": 0.4170097889326838,
"acc_stderr": 0.00986570196677561
},
"harness|drop|3": {
"em": 0.0028313758389261743,
"em_stderr": 0.0005441551135494192,
"f1": 0.06384752516778533,
"f1_stderr": 0.0014352043722984558
},
"harness|gsm8k|5": {
"acc": 0.08263836239575435,
"acc_stderr": 0.007584089220148112
},
"harness|winogrande|5": {
"acc": 0.7513812154696132,
"acc_stderr": 0.012147314713403108
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_TFLai__llama-2-13b-4bit-alpaca-gpt4
|
[
"region:us"
] |
2023-08-17T23:18:31+00:00
|
{"pretty_name": "Evaluation run of TFLai/llama-2-13b-4bit-alpaca-gpt4", "dataset_summary": "Dataset automatically created during the evaluation run of model [TFLai/llama-2-13b-4bit-alpaca-gpt4](https://huggingface.co/TFLai/llama-2-13b-4bit-alpaca-gpt4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TFLai__llama-2-13b-4bit-alpaca-gpt4\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-23T12:02:28.224092](https://huggingface.co/datasets/open-llm-leaderboard/details_TFLai__llama-2-13b-4bit-alpaca-gpt4/blob/main/results_2023-09-23T12-02-28.224092.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0028313758389261743,\n \"em_stderr\": 0.0005441551135494192,\n \"f1\": 0.06384752516778533,\n \"f1_stderr\": 0.0014352043722984558,\n \"acc\": 0.4170097889326838,\n \"acc_stderr\": 0.00986570196677561\n },\n \"harness|drop|3\": {\n \"em\": 0.0028313758389261743,\n \"em_stderr\": 0.0005441551135494192,\n \"f1\": 0.06384752516778533,\n \"f1_stderr\": 0.0014352043722984558\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.08263836239575435,\n \"acc_stderr\": 0.007584089220148112\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7513812154696132,\n \"acc_stderr\": 0.012147314713403108\n }\n}\n```", "repo_url": "https://huggingface.co/TFLai/llama-2-13b-4bit-alpaca-gpt4", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_10T01_35_37.376954", "path": ["**/details_harness|arc:challenge|25_2023-08-10T01:35:37.376954.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-10T01:35:37.376954.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_23T12_02_28.224092", "path": ["**/details_harness|drop|3_2023-09-23T12-02-28.224092.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-23T12-02-28.224092.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_23T12_02_28.224092", "path": ["**/details_harness|gsm8k|5_2023-09-23T12-02-28.224092.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-23T12-02-28.224092.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_10T01_35_37.376954", "path": ["**/details_harness|hellaswag|10_2023-08-10T01:35:37.376954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-10T01:35:37.376954.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_10T01_35_37.376954", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-10T01:35:37.376954.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-10T01:35:37.376954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-10T01:35:37.376954.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-10T01:35:37.376954.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-10T01:35:37.376954.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_08_10T01_35_37.376954", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-10T01:35:37.376954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-10T01:35:37.376954.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_10T01_35_37.376954", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-10T01:35:37.376954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-10T01:35:37.376954.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_10T01_35_37.376954", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-10T01:35:37.376954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-10T01:35:37.376954.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_10T01_35_37.376954", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-10T01:35:37.376954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-10T01:35:37.376954.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_10T01_35_37.376954", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-10T01:35:37.376954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-10T01:35:37.376954.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_10T01_35_37.376954", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-10T01:35:37.376954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-10T01:35:37.376954.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_10T01_35_37.376954", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-10T01:35:37.376954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-10T01:35:37.376954.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_10T01_35_37.376954", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-10T01:35:37.376954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-10T01:35:37.376954.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_10T01_35_37.376954", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-10T01:35:37.376954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-10T01:35:37.376954.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_10T01_35_37.376954", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-10T01:35:37.376954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-10T01:35:37.376954.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_10T01_35_37.376954", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-10T01:35:37.376954.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-10T01:35:37.376954.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_10T01_35_37.376954", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-10T01:35:37.376954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-10T01:35:37.376954.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_10T01_35_37.376954", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-10T01:35:37.376954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-10T01:35:37.376954.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_10T01_35_37.376954", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-10T01:35:37.376954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-10T01:35:37.376954.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_10T01_35_37.376954", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-10T01:35:37.376954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-10T01:35:37.376954.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_10T01_35_37.376954", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-10T01:35:37.376954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-10T01:35:37.376954.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_10T01_35_37.376954", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-10T01:35:37.376954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-10T01:35:37.376954.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_10T01_35_37.376954", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-10T01:35:37.376954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-10T01:35:37.376954.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_10T01_35_37.376954", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-10T01:35:37.376954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-10T01:35:37.376954.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_10T01_35_37.376954", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-10T01:35:37.376954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-10T01:35:37.376954.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_10T01_35_37.376954", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-10T01:35:37.376954.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-10T01:35:37.376954.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_10T01_35_37.376954", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-10T01:35:37.376954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-10T01:35:37.376954.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_10T01_35_37.376954", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-10T01:35:37.376954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-10T01:35:37.376954.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_10T01_35_37.376954", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-10T01:35:37.376954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-10T01:35:37.376954.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_10T01_35_37.376954", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-10T01:35:37.376954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-10T01:35:37.376954.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_10T01_35_37.376954", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-10T01:35:37.376954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-10T01:35:37.376954.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_10T01_35_37.376954", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-10T01:35:37.376954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-10T01:35:37.376954.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_10T01_35_37.376954", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-10T01:35:37.376954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-10T01:35:37.376954.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_10T01_35_37.376954", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-10T01:35:37.376954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-10T01:35:37.376954.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_10T01_35_37.376954", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-10T01:35:37.376954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-10T01:35:37.376954.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_10T01_35_37.376954", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-10T01:35:37.376954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-10T01:35:37.376954.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_10T01_35_37.376954", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-10T01:35:37.376954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-10T01:35:37.376954.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_10T01_35_37.376954", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-10T01:35:37.376954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-10T01:35:37.376954.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_10T01_35_37.376954", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-10T01:35:37.376954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-10T01:35:37.376954.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_10T01_35_37.376954", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-10T01:35:37.376954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-10T01:35:37.376954.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_10T01_35_37.376954", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-10T01:35:37.376954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-10T01:35:37.376954.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_10T01_35_37.376954", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-10T01:35:37.376954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-10T01:35:37.376954.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_10T01_35_37.376954", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-10T01:35:37.376954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-10T01:35:37.376954.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_10T01_35_37.376954", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-10T01:35:37.376954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-10T01:35:37.376954.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_10T01_35_37.376954", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-10T01:35:37.376954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-10T01:35:37.376954.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_10T01_35_37.376954", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-10T01:35:37.376954.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-10T01:35:37.376954.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_10T01_35_37.376954", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-10T01:35:37.376954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-10T01:35:37.376954.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_10T01_35_37.376954", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-10T01:35:37.376954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-10T01:35:37.376954.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_10T01_35_37.376954", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-10T01:35:37.376954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-10T01:35:37.376954.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_10T01_35_37.376954", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-10T01:35:37.376954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-10T01:35:37.376954.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_10T01_35_37.376954", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-10T01:35:37.376954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-10T01:35:37.376954.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_10T01_35_37.376954", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-10T01:35:37.376954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-10T01:35:37.376954.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_10T01_35_37.376954", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-10T01:35:37.376954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-10T01:35:37.376954.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_10T01_35_37.376954", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-10T01:35:37.376954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-10T01:35:37.376954.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_10T01_35_37.376954", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-10T01:35:37.376954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-10T01:35:37.376954.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_10T01_35_37.376954", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-10T01:35:37.376954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-10T01:35:37.376954.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_08_10T01_35_37.376954", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-10T01:35:37.376954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-10T01:35:37.376954.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_10T01_35_37.376954", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-10T01:35:37.376954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-10T01:35:37.376954.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_10T01_35_37.376954", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-10T01:35:37.376954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-10T01:35:37.376954.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_10T01_35_37.376954", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-10T01:35:37.376954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-10T01:35:37.376954.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_10T01_35_37.376954", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-10T01:35:37.376954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-10T01:35:37.376954.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_10T01_35_37.376954", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-10T01:35:37.376954.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-10T01:35:37.376954.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_10T01_35_37.376954", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-10T01:35:37.376954.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-10T01:35:37.376954.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_23T12_02_28.224092", "path": ["**/details_harness|winogrande|5_2023-09-23T12-02-28.224092.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-23T12-02-28.224092.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_10T01_35_37.376954", "path": ["results_2023-08-10T01:35:37.376954.parquet"]}, {"split": "2023_09_23T12_02_28.224092", "path": ["results_2023-09-23T12-02-28.224092.parquet"]}, {"split": "latest", "path": ["results_2023-09-23T12-02-28.224092.parquet"]}]}]}
|
2023-09-23T11:02:40+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of TFLai/llama-2-13b-4bit-alpaca-gpt4
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model TFLai/llama-2-13b-4bit-alpaca-gpt4 on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
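A minimal sketch of that call (the repository name, `open-llm-leaderboard/details_TFLai__llama-2-13b-4bit-alpaca-gpt4`, is given in this card's metadata):
```python
from datasets import load_dataset

data = load_dataset("open-llm-leaderboard/details_TFLai__llama-2-13b-4bit-alpaca-gpt4",
	"harness_winogrande_5",
	split="train")
```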
## Latest results
These are the latest results from run 2023-09-23T12:02:28.224092 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of TFLai/llama-2-13b-4bit-alpaca-gpt4",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TFLai/llama-2-13b-4bit-alpaca-gpt4 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-23T12:02:28.224092(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of TFLai/llama-2-13b-4bit-alpaca-gpt4",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TFLai/llama-2-13b-4bit-alpaca-gpt4 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-23T12:02:28.224092(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
27,
31,
175,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of TFLai/llama-2-13b-4bit-alpaca-gpt4## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model TFLai/llama-2-13b-4bit-alpaca-gpt4 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-23T12:02:28.224092(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
d6a7080deb242ff5e81f4737dab9be08b396e63e
|
# Dataset Card for Evaluation run of Open-Orca/OpenOrca-Preview1-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Open-Orca/OpenOrca-Preview1-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [Open-Orca/OpenOrca-Preview1-13B](https://huggingface.co/Open-Orca/OpenOrca-Preview1-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 4 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Open-Orca__OpenOrca-Preview1-13B",
"harness_winogrande_5",
split="train")
```
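Similarly, the aggregated "results" configuration described above can be loaded directly; a sketch, assuming this repository follows the same "results"/"latest" convention as the other details repositories:
```python
from datasets import load_dataset

# Assumes a "results" config with a "latest" split, per the convention
# used across these evaluation-details repositories.
results = load_dataset("open-llm-leaderboard/details_Open-Orca__OpenOrca-Preview1-13B",
	"results",
	split="latest")
```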
## Latest results
These are the [latest results from run 2023-10-14T20:21:09.766845](https://huggingface.co/datasets/open-llm-leaderboard/details_Open-Orca__OpenOrca-Preview1-13B/blob/main/results_2023-10-14T20-21-09.766845.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.29068791946308725,
"em_stderr": 0.0046502020293331045,
"f1": 0.3704488255033573,
"f1_stderr": 0.004551116258249772,
"acc": 0.37980957088224854,
"acc_stderr": 0.009355350731725371
},
"harness|drop|3": {
"em": 0.29068791946308725,
"em_stderr": 0.0046502020293331045,
"f1": 0.3704488255033573,
"f1_stderr": 0.004551116258249772
},
"harness|gsm8k|5": {
"acc": 0.04927975739196361,
"acc_stderr": 0.005962150655812475
},
"harness|winogrande|5": {
"acc": 0.7103393843725335,
"acc_stderr": 0.012748550807638268
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_Open-Orca__OpenOrca-Preview1-13B
|
[
"region:us"
] |
2023-08-17T23:18:40+00:00
|
{"pretty_name": "Evaluation run of Open-Orca/OpenOrca-Preview1-13B", "dataset_summary": "Dataset automatically created during the evaluation run of model [Open-Orca/OpenOrca-Preview1-13B](https://huggingface.co/Open-Orca/OpenOrca-Preview1-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Open-Orca__OpenOrca-Preview1-13B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-14T20:21:09.766845](https://huggingface.co/datasets/open-llm-leaderboard/details_Open-Orca__OpenOrca-Preview1-13B/blob/main/results_2023-10-14T20-21-09.766845.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.29068791946308725,\n \"em_stderr\": 0.0046502020293331045,\n \"f1\": 0.3704488255033573,\n \"f1_stderr\": 0.004551116258249772,\n \"acc\": 0.37980957088224854,\n \"acc_stderr\": 0.009355350731725371\n },\n \"harness|drop|3\": {\n \"em\": 0.29068791946308725,\n \"em_stderr\": 0.0046502020293331045,\n \"f1\": 0.3704488255033573,\n \"f1_stderr\": 0.004551116258249772\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.04927975739196361,\n \"acc_stderr\": 0.005962150655812475\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7103393843725335,\n \"acc_stderr\": 0.012748550807638268\n }\n}\n```", "repo_url": "https://huggingface.co/Open-Orca/OpenOrca-Preview1-13B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T18_45_02.691094", "path": ["**/details_harness|arc:challenge|25_2023-07-19T18:45:02.691094.parquet"]}, {"split": "2023_07_25T10_13_37.099822", "path": ["**/details_harness|arc:challenge|25_2023-07-25T10:13:37.099822.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-25T10:13:37.099822.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_16T21_46_22.998704", "path": ["**/details_harness|drop|3_2023-09-16T21-46-22.998704.parquet"]}, {"split": "2023_10_14T20_21_09.766845", "path": ["**/details_harness|drop|3_2023-10-14T20-21-09.766845.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-14T20-21-09.766845.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_16T21_46_22.998704", "path": ["**/details_harness|gsm8k|5_2023-09-16T21-46-22.998704.parquet"]}, {"split": "2023_10_14T20_21_09.766845", "path": ["**/details_harness|gsm8k|5_2023-10-14T20-21-09.766845.parquet"]}, {"split": "latest", "path": 
["**/details_harness|gsm8k|5_2023-10-14T20-21-09.766845.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T18_45_02.691094", "path": ["**/details_harness|hellaswag|10_2023-07-19T18:45:02.691094.parquet"]}, {"split": "2023_07_25T10_13_37.099822", "path": ["**/details_harness|hellaswag|10_2023-07-25T10:13:37.099822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-25T10:13:37.099822.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T18_45_02.691094", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:45:02.691094.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:45:02.691094.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:45:02.691094.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:45:02.691094.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:45:02.691094.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:45:02.691094.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:45:02.691094.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:45:02.691094.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:45:02.691094.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:45:02.691094.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:45:02.691094.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:45:02.691094.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:45:02.691094.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:45:02.691094.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:45:02.691094.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:45:02.691094.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:45:02.691094.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:45:02.691094.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:45:02.691094.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:45:02.691094.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:45:02.691094.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:45:02.691094.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:45:02.691094.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:45:02.691094.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:45:02.691094.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:45:02.691094.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:45:02.691094.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:45:02.691094.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:45:02.691094.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:45:02.691094.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:45:02.691094.parquet", 
"**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:45:02.691094.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:45:02.691094.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:45:02.691094.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:45:02.691094.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:45:02.691094.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:45:02.691094.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:45:02.691094.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T18:45:02.691094.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:45:02.691094.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:45:02.691094.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:45:02.691094.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:45:02.691094.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:45:02.691094.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:45:02.691094.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:45:02.691094.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:45:02.691094.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:45:02.691094.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:45:02.691094.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:45:02.691094.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:45:02.691094.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:45:02.691094.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:45:02.691094.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:45:02.691094.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:45:02.691094.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T18:45:02.691094.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:45:02.691094.parquet"]}, {"split": "2023_07_25T10_13_37.099822", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T10:13:37.099822.parquet", 
"**/details_harness|hendrycksTest-econometrics|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T10:13:37.099822.parquet", 
"**/details_harness|hendrycksTest-public_relations|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-25T10:13:37.099822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T10:13:37.099822.parquet", 
"**/details_harness|hendrycksTest-human_aging|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-25T10:13:37.099822.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-25T10:13:37.099822.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_19T18_45_02.691094", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:45:02.691094.parquet"]}, {"split": "2023_07_25T10_13_37.099822", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T10:13:37.099822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-25T10:13:37.099822.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T18_45_02.691094", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:45:02.691094.parquet"]}, {"split": "2023_07_25T10_13_37.099822", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-25T10:13:37.099822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-25T10:13:37.099822.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T18_45_02.691094", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:45:02.691094.parquet"]}, {"split": "2023_07_25T10_13_37.099822", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-25T10:13:37.099822.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-astronomy|5_2023-07-25T10:13:37.099822.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T18_45_02.691094", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:45:02.691094.parquet"]}, {"split": "2023_07_25T10_13_37.099822", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T10:13:37.099822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-25T10:13:37.099822.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T18_45_02.691094", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:45:02.691094.parquet"]}, {"split": "2023_07_25T10_13_37.099822", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T10:13:37.099822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-25T10:13:37.099822.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T18_45_02.691094", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:45:02.691094.parquet"]}, {"split": "2023_07_25T10_13_37.099822", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-25T10:13:37.099822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-25T10:13:37.099822.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T18_45_02.691094", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:45:02.691094.parquet"]}, {"split": "2023_07_25T10_13_37.099822", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T10:13:37.099822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-25T10:13:37.099822.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T18_45_02.691094", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:45:02.691094.parquet"]}, {"split": "2023_07_25T10_13_37.099822", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T10:13:37.099822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-25T10:13:37.099822.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T18_45_02.691094", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:45:02.691094.parquet"]}, {"split": "2023_07_25T10_13_37.099822", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T10:13:37.099822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-25T10:13:37.099822.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T18_45_02.691094", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:45:02.691094.parquet"]}, {"split": "2023_07_25T10_13_37.099822", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T10:13:37.099822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-25T10:13:37.099822.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", 
"data_files": [{"split": "2023_07_19T18_45_02.691094", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:45:02.691094.parquet"]}, {"split": "2023_07_25T10_13_37.099822", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-25T10:13:37.099822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-25T10:13:37.099822.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T18_45_02.691094", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:45:02.691094.parquet"]}, {"split": "2023_07_25T10_13_37.099822", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-25T10:13:37.099822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-25T10:13:37.099822.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T18_45_02.691094", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:45:02.691094.parquet"]}, {"split": "2023_07_25T10_13_37.099822", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T10:13:37.099822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-25T10:13:37.099822.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T18_45_02.691094", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:45:02.691094.parquet"]}, {"split": "2023_07_25T10_13_37.099822", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-25T10:13:37.099822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-25T10:13:37.099822.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T18_45_02.691094", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:45:02.691094.parquet"]}, {"split": "2023_07_25T10_13_37.099822", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T10:13:37.099822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-25T10:13:37.099822.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T18_45_02.691094", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:45:02.691094.parquet"]}, {"split": "2023_07_25T10_13_37.099822", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T10:13:37.099822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-25T10:13:37.099822.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T18_45_02.691094", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:45:02.691094.parquet"]}, {"split": "2023_07_25T10_13_37.099822", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T10:13:37.099822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-25T10:13:37.099822.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T18_45_02.691094", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:45:02.691094.parquet"]}, {"split": 
"2023_07_25T10_13_37.099822", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-25T10:13:37.099822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-25T10:13:37.099822.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T18_45_02.691094", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:45:02.691094.parquet"]}, {"split": "2023_07_25T10_13_37.099822", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T10:13:37.099822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-25T10:13:37.099822.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T18_45_02.691094", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:45:02.691094.parquet"]}, {"split": "2023_07_25T10_13_37.099822", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T10:13:37.099822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-25T10:13:37.099822.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T18_45_02.691094", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:45:02.691094.parquet"]}, {"split": "2023_07_25T10_13_37.099822", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T10:13:37.099822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-25T10:13:37.099822.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T18_45_02.691094", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:45:02.691094.parquet"]}, {"split": "2023_07_25T10_13_37.099822", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T10:13:37.099822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-25T10:13:37.099822.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T18_45_02.691094", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:45:02.691094.parquet"]}, {"split": "2023_07_25T10_13_37.099822", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T10:13:37.099822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-25T10:13:37.099822.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T18_45_02.691094", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:45:02.691094.parquet"]}, {"split": "2023_07_25T10_13_37.099822", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T10:13:37.099822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-25T10:13:37.099822.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T18_45_02.691094", "path": 
["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:45:02.691094.parquet"]}, {"split": "2023_07_25T10_13_37.099822", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T10:13:37.099822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-25T10:13:37.099822.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T18_45_02.691094", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:45:02.691094.parquet"]}, {"split": "2023_07_25T10_13_37.099822", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T10:13:37.099822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-25T10:13:37.099822.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T18_45_02.691094", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:45:02.691094.parquet"]}, {"split": "2023_07_25T10_13_37.099822", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T10:13:37.099822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-25T10:13:37.099822.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T18_45_02.691094", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:45:02.691094.parquet"]}, {"split": "2023_07_25T10_13_37.099822", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T10:13:37.099822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-25T10:13:37.099822.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T18_45_02.691094", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:45:02.691094.parquet"]}, {"split": "2023_07_25T10_13_37.099822", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T10:13:37.099822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-25T10:13:37.099822.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T18_45_02.691094", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:45:02.691094.parquet"]}, {"split": "2023_07_25T10_13_37.099822", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T10:13:37.099822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-25T10:13:37.099822.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T18_45_02.691094", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:45:02.691094.parquet"]}, {"split": "2023_07_25T10_13_37.099822", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T10:13:37.099822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-25T10:13:37.099822.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T18_45_02.691094", "path": 
["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:45:02.691094.parquet"]}, {"split": "2023_07_25T10_13_37.099822", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T10:13:37.099822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-25T10:13:37.099822.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T18_45_02.691094", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:45:02.691094.parquet"]}, {"split": "2023_07_25T10_13_37.099822", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-25T10:13:37.099822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-25T10:13:37.099822.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T18_45_02.691094", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:45:02.691094.parquet"]}, {"split": "2023_07_25T10_13_37.099822", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T10:13:37.099822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-25T10:13:37.099822.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T18_45_02.691094", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:45:02.691094.parquet"]}, {"split": "2023_07_25T10_13_37.099822", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-25T10:13:37.099822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-25T10:13:37.099822.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T18_45_02.691094", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:45:02.691094.parquet"]}, {"split": "2023_07_25T10_13_37.099822", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T10:13:37.099822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-25T10:13:37.099822.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T18_45_02.691094", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:45:02.691094.parquet"]}, {"split": "2023_07_25T10_13_37.099822", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T10:13:37.099822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-25T10:13:37.099822.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T18_45_02.691094", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:45:02.691094.parquet"]}, {"split": "2023_07_25T10_13_37.099822", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T10:13:37.099822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-25T10:13:37.099822.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T18_45_02.691094", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T18:45:02.691094.parquet"]}, {"split": "2023_07_25T10_13_37.099822", "path": 
["**/details_harness|hendrycksTest-management|5_2023-07-25T10:13:37.099822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-25T10:13:37.099822.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T18_45_02.691094", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:45:02.691094.parquet"]}, {"split": "2023_07_25T10_13_37.099822", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-25T10:13:37.099822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-25T10:13:37.099822.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T18_45_02.691094", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:45:02.691094.parquet"]}, {"split": "2023_07_25T10_13_37.099822", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T10:13:37.099822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-25T10:13:37.099822.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T18_45_02.691094", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:45:02.691094.parquet"]}, {"split": "2023_07_25T10_13_37.099822", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T10:13:37.099822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-25T10:13:37.099822.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T18_45_02.691094", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:45:02.691094.parquet"]}, {"split": "2023_07_25T10_13_37.099822", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T10:13:37.099822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-25T10:13:37.099822.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T18_45_02.691094", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:45:02.691094.parquet"]}, {"split": "2023_07_25T10_13_37.099822", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T10:13:37.099822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-25T10:13:37.099822.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T18_45_02.691094", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:45:02.691094.parquet"]}, {"split": "2023_07_25T10_13_37.099822", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-25T10:13:37.099822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-25T10:13:37.099822.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T18_45_02.691094", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:45:02.691094.parquet"]}, {"split": "2023_07_25T10_13_37.099822", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-25T10:13:37.099822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-25T10:13:37.099822.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": 
"2023_07_19T18_45_02.691094", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:45:02.691094.parquet"]}, {"split": "2023_07_25T10_13_37.099822", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-25T10:13:37.099822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-25T10:13:37.099822.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T18_45_02.691094", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:45:02.691094.parquet"]}, {"split": "2023_07_25T10_13_37.099822", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T10:13:37.099822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-25T10:13:37.099822.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T18_45_02.691094", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:45:02.691094.parquet"]}, {"split": "2023_07_25T10_13_37.099822", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-25T10:13:37.099822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-25T10:13:37.099822.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T18_45_02.691094", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:45:02.691094.parquet"]}, {"split": "2023_07_25T10_13_37.099822", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T10:13:37.099822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-25T10:13:37.099822.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T18_45_02.691094", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:45:02.691094.parquet"]}, {"split": "2023_07_25T10_13_37.099822", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T10:13:37.099822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-25T10:13:37.099822.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_19T18_45_02.691094", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:45:02.691094.parquet"]}, {"split": "2023_07_25T10_13_37.099822", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-25T10:13:37.099822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-25T10:13:37.099822.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T18_45_02.691094", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:45:02.691094.parquet"]}, {"split": "2023_07_25T10_13_37.099822", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-25T10:13:37.099822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-25T10:13:37.099822.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T18_45_02.691094", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:45:02.691094.parquet"]}, {"split": 
"2023_07_25T10_13_37.099822", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-25T10:13:37.099822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-25T10:13:37.099822.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T18_45_02.691094", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:45:02.691094.parquet"]}, {"split": "2023_07_25T10_13_37.099822", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T10:13:37.099822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-25T10:13:37.099822.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T18_45_02.691094", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T18:45:02.691094.parquet"]}, {"split": "2023_07_25T10_13_37.099822", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-25T10:13:37.099822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-25T10:13:37.099822.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T18_45_02.691094", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:45:02.691094.parquet"]}, {"split": "2023_07_25T10_13_37.099822", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-25T10:13:37.099822.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-25T10:13:37.099822.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T18_45_02.691094", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T18:45:02.691094.parquet"]}, {"split": "2023_07_25T10_13_37.099822", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-25T10:13:37.099822.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-25T10:13:37.099822.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_16T21_46_22.998704", "path": ["**/details_harness|winogrande|5_2023-09-16T21-46-22.998704.parquet"]}, {"split": "2023_10_14T20_21_09.766845", "path": ["**/details_harness|winogrande|5_2023-10-14T20-21-09.766845.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-14T20-21-09.766845.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T18_45_02.691094", "path": ["results_2023-07-19T18:45:02.691094.parquet"]}, {"split": "2023_07_25T10_13_37.099822", "path": ["results_2023-07-25T10:13:37.099822.parquet"]}, {"split": "2023_09_16T21_46_22.998704", "path": ["results_2023-09-16T21-46-22.998704.parquet"]}, {"split": "2023_10_14T20_21_09.766845", "path": ["results_2023-10-14T20-21-09.766845.parquet"]}, {"split": "latest", "path": ["results_2023-10-14T20-21-09.766845.parquet"]}]}]}
|
2023-10-14T19:21:18+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of Open-Orca/OpenOrca-Preview1-13B
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model Open-Orca/OpenOrca-Preview1-13B on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
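```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Open-Orca__OpenOrca-Preview1-13B",
	"harness_winogrande_5",
	split="train")
```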
## Latest results
These are the latest results from run 2023-10-14T20:21:09.766845 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
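```python
{
    "all": {
        "em": 0.29068791946308725,
        "em_stderr": 0.0046502020293331045,
        "f1": 0.3704488255033573,
        "f1_stderr": 0.004551116258249772,
        "acc": 0.37980957088224854,
        "acc_stderr": 0.009355350731725371
    },
    "harness|drop|3": {
        "em": 0.29068791946308725,
        "em_stderr": 0.0046502020293331045,
        "f1": 0.3704488255033573,
        "f1_stderr": 0.004551116258249772
    },
    "harness|gsm8k|5": {
        "acc": 0.04927975739196361,
        "acc_stderr": 0.005962150655812475
    },
    "harness|winogrande|5": {
        "acc": 0.7103393843725335,
        "acc_stderr": 0.012748550807638268
    }
}
```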
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of Open-Orca/OpenOrca-Preview1-13B",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Open-Orca/OpenOrca-Preview1-13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-14T20:21:09.766845(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Open-Orca/OpenOrca-Preview1-13B",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Open-Orca/OpenOrca-Preview1-13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-14T20:21:09.766845(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
23,
31,
171,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Open-Orca/OpenOrca-Preview1-13B## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Open-Orca/OpenOrca-Preview1-13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-14T20:21:09.766845(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
06e062386aba5764269fdce493d586af2b39576e
|
# Dataset Card for Evaluation run of Open-Orca/OpenOrcaxOpenChat-Preview2-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Open-Orca/OpenOrcaxOpenChat-Preview2-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [Open-Orca/OpenOrcaxOpenChat-Preview2-13B](https://huggingface.co/Open-Orca/OpenOrcaxOpenChat-Preview2-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 7 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Open-Orca__OpenOrcaxOpenChat-Preview2-13B",
"harness_winogrande_5",
split="train")
```
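You can query the other configurations in the same way, either to read the aggregated metrics or to pin a specific timestamped run; a short sketch (the config and split names below are taken from this dataset's metadata):
```python
from datasets import load_dataset

# Aggregated metrics: the "results" configuration keeps one split per run,
# plus a "latest" split pointing at the most recent snapshot.
results = load_dataset("open-llm-leaderboard/details_Open-Orca__OpenOrcaxOpenChat-Preview2-13B",
	"results",
	split="latest")

# Per-task details for one specific run, selected by its timestamped split
# name as listed in the harness_gsm8k_5 configuration.
gsm8k_run = load_dataset("open-llm-leaderboard/details_Open-Orca__OpenOrcaxOpenChat-Preview2-13B",
	"harness_gsm8k_5",
	split="2023_10_16T12_28_11.016345")
```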
## Latest results
These are the [latest results from run 2023-10-16T12:28:11.016345](https://huggingface.co/datasets/open-llm-leaderboard/details_Open-Orca__OpenOrcaxOpenChat-Preview2-13B/blob/main/results_2023-10-16T12-28-11.016345.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.005662751677852349,
"em_stderr": 0.0007684582267637388,
"f1": 0.08682151845637606,
"f1_stderr": 0.0017946969581043605,
"acc": 0.4645440657550116,
"acc_stderr": 0.01076755669090175
},
"harness|drop|3": {
"em": 0.005662751677852349,
"em_stderr": 0.0007684582267637388,
"f1": 0.08682151845637606,
"f1_stderr": 0.0017946969581043605
},
"harness|gsm8k|5": {
"acc": 0.1508718726307809,
"acc_stderr": 0.009859004137305687
},
"harness|winogrande|5": {
"acc": 0.7782162588792423,
"acc_stderr": 0.011676109244497813
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_Open-Orca__OpenOrcaxOpenChat-Preview2-13B
|
[
"region:us"
] |
2023-08-17T23:18:57+00:00
|
{"pretty_name": "Evaluation run of Open-Orca/OpenOrcaxOpenChat-Preview2-13B", "dataset_summary": "Dataset automatically created during the evaluation run of model [Open-Orca/OpenOrcaxOpenChat-Preview2-13B](https://huggingface.co/Open-Orca/OpenOrcaxOpenChat-Preview2-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 7 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Open-Orca__OpenOrcaxOpenChat-Preview2-13B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-16T12:28:11.016345](https://huggingface.co/datasets/open-llm-leaderboard/details_Open-Orca__OpenOrcaxOpenChat-Preview2-13B/blob/main/results_2023-10-16T12-28-11.016345.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.005662751677852349,\n \"em_stderr\": 0.0007684582267637388,\n \"f1\": 0.08682151845637606,\n \"f1_stderr\": 0.0017946969581043605,\n \"acc\": 0.4645440657550116,\n \"acc_stderr\": 0.01076755669090175\n },\n \"harness|drop|3\": {\n \"em\": 0.005662751677852349,\n \"em_stderr\": 0.0007684582267637388,\n \"f1\": 0.08682151845637606,\n \"f1_stderr\": 0.0017946969581043605\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1508718726307809,\n \"acc_stderr\": 0.009859004137305687\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7782162588792423,\n \"acc_stderr\": 0.011676109244497813\n }\n}\n```", "repo_url": "https://huggingface.co/Open-Orca/OpenOrcaxOpenChat-Preview2-13B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_09T10_54_28.159442", "path": ["**/details_harness|arc:challenge|25_2023-08-09T10:54:28.159442.parquet"]}, {"split": "2023_08_09T11_01_47.680717", "path": ["**/details_harness|arc:challenge|25_2023-08-09T11:01:47.680717.parquet"]}, {"split": "2023_08_09T19_53_44.921082", "path": ["**/details_harness|arc:challenge|25_2023-08-09T19:53:44.921082.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-08-09T19:53:44.921082.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_17T22_24_47.406808", "path": ["**/details_harness|drop|3_2023-09-17T22-24-47.406808.parquet"]}, {"split": "2023_10_13T14_08_03.258972", "path": ["**/details_harness|drop|3_2023-10-13T14-08-03.258972.parquet"]}, {"split": "2023_10_14T08_18_52.745685", "path": ["**/details_harness|drop|3_2023-10-14T08-18-52.745685.parquet"]}, {"split": "2023_10_16T12_28_11.016345", "path": ["**/details_harness|drop|3_2023-10-16T12-28-11.016345.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|drop|3_2023-10-16T12-28-11.016345.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_17T22_24_47.406808", "path": ["**/details_harness|gsm8k|5_2023-09-17T22-24-47.406808.parquet"]}, {"split": "2023_10_13T14_08_03.258972", "path": ["**/details_harness|gsm8k|5_2023-10-13T14-08-03.258972.parquet"]}, {"split": "2023_10_14T08_18_52.745685", "path": ["**/details_harness|gsm8k|5_2023-10-14T08-18-52.745685.parquet"]}, {"split": "2023_10_16T12_28_11.016345", "path": ["**/details_harness|gsm8k|5_2023-10-16T12-28-11.016345.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-16T12-28-11.016345.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_09T10_54_28.159442", "path": ["**/details_harness|hellaswag|10_2023-08-09T10:54:28.159442.parquet"]}, {"split": "2023_08_09T11_01_47.680717", "path": ["**/details_harness|hellaswag|10_2023-08-09T11:01:47.680717.parquet"]}, {"split": "2023_08_09T19_53_44.921082", "path": ["**/details_harness|hellaswag|10_2023-08-09T19:53:44.921082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-08-09T19:53:44.921082.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_09T10_54_28.159442", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T10:54:28.159442.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T10:54:28.159442.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T10:54:28.159442.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T10:54:28.159442.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T10:54:28.159442.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T10:54:28.159442.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T10:54:28.159442.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T10:54:28.159442.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T10:54:28.159442.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T10:54:28.159442.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T10:54:28.159442.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T10:54:28.159442.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T10:54:28.159442.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T10:54:28.159442.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T10:54:28.159442.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T10:54:28.159442.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T10:54:28.159442.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T10:54:28.159442.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T10:54:28.159442.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T10:54:28.159442.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T10:54:28.159442.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T10:54:28.159442.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T10:54:28.159442.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T10:54:28.159442.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T10:54:28.159442.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T10:54:28.159442.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T10:54:28.159442.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T10:54:28.159442.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T10:54:28.159442.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T10:54:28.159442.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T10:54:28.159442.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T10:54:28.159442.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T10:54:28.159442.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T10:54:28.159442.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T10:54:28.159442.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T10:54:28.159442.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T10:54:28.159442.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T10:54:28.159442.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T10:54:28.159442.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T10:54:28.159442.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T10:54:28.159442.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T10:54:28.159442.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T10:54:28.159442.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T10:54:28.159442.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T10:54:28.159442.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T10:54:28.159442.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T10:54:28.159442.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T10:54:28.159442.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T10:54:28.159442.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T10:54:28.159442.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T10:54:28.159442.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T10:54:28.159442.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T10:54:28.159442.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T10:54:28.159442.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T10:54:28.159442.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T10:54:28.159442.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T10:54:28.159442.parquet"]}, {"split": "2023_08_09T11_01_47.680717", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T11:01:47.680717.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T11:01:47.680717.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T11:01:47.680717.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T11:01:47.680717.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T11:01:47.680717.parquet", 
"**/details_harness|hendrycksTest-college_biology|5_2023-08-09T11:01:47.680717.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T11:01:47.680717.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T11:01:47.680717.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T11:01:47.680717.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T11:01:47.680717.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T11:01:47.680717.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T11:01:47.680717.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T11:01:47.680717.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T11:01:47.680717.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T11:01:47.680717.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T11:01:47.680717.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T11:01:47.680717.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T11:01:47.680717.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T11:01:47.680717.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T11:01:47.680717.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T11:01:47.680717.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T11:01:47.680717.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T11:01:47.680717.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T11:01:47.680717.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T11:01:47.680717.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T11:01:47.680717.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T11:01:47.680717.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T11:01:47.680717.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T11:01:47.680717.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T11:01:47.680717.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T11:01:47.680717.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T11:01:47.680717.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T11:01:47.680717.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T11:01:47.680717.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T11:01:47.680717.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T11:01:47.680717.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T11:01:47.680717.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T11:01:47.680717.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T11:01:47.680717.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T11:01:47.680717.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T11:01:47.680717.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T11:01:47.680717.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T11:01:47.680717.parquet", 
"**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T11:01:47.680717.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T11:01:47.680717.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T11:01:47.680717.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T11:01:47.680717.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T11:01:47.680717.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T11:01:47.680717.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T11:01:47.680717.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T11:01:47.680717.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T11:01:47.680717.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T11:01:47.680717.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T11:01:47.680717.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T11:01:47.680717.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T11:01:47.680717.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T11:01:47.680717.parquet"]}, {"split": "2023_08_09T19_53_44.921082", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T19:53:44.921082.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T19:53:44.921082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-09T19:53:44.921082.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T19:53:44.921082.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-09T19:53:44.921082.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-09T19:53:44.921082.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_09T10_54_28.159442", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T10:54:28.159442.parquet"]}, {"split": "2023_08_09T11_01_47.680717", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T11:01:47.680717.parquet"]}, {"split": "2023_08_09T19_53_44.921082", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T19:53:44.921082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-09T19:53:44.921082.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_09T10_54_28.159442", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T10:54:28.159442.parquet"]}, {"split": "2023_08_09T11_01_47.680717", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T11:01:47.680717.parquet"]}, {"split": "2023_08_09T19_53_44.921082", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T19:53:44.921082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-09T19:53:44.921082.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_09T10_54_28.159442", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T10:54:28.159442.parquet"]}, {"split": "2023_08_09T11_01_47.680717", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T11:01:47.680717.parquet"]}, {"split": "2023_08_09T19_53_44.921082", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T19:53:44.921082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-09T19:53:44.921082.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_09T10_54_28.159442", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T10:54:28.159442.parquet"]}, {"split": "2023_08_09T11_01_47.680717", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T11:01:47.680717.parquet"]}, {"split": "2023_08_09T19_53_44.921082", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T19:53:44.921082.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-business_ethics|5_2023-08-09T19:53:44.921082.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_09T10_54_28.159442", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T10:54:28.159442.parquet"]}, {"split": "2023_08_09T11_01_47.680717", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T11:01:47.680717.parquet"]}, {"split": "2023_08_09T19_53_44.921082", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T19:53:44.921082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-09T19:53:44.921082.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_09T10_54_28.159442", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T10:54:28.159442.parquet"]}, {"split": "2023_08_09T11_01_47.680717", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T11:01:47.680717.parquet"]}, {"split": "2023_08_09T19_53_44.921082", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T19:53:44.921082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-09T19:53:44.921082.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_09T10_54_28.159442", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T10:54:28.159442.parquet"]}, {"split": "2023_08_09T11_01_47.680717", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T11:01:47.680717.parquet"]}, {"split": "2023_08_09T19_53_44.921082", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T19:53:44.921082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-09T19:53:44.921082.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_09T10_54_28.159442", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T10:54:28.159442.parquet"]}, {"split": "2023_08_09T11_01_47.680717", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T11:01:47.680717.parquet"]}, {"split": "2023_08_09T19_53_44.921082", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T19:53:44.921082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-09T19:53:44.921082.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_09T10_54_28.159442", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T10:54:28.159442.parquet"]}, {"split": "2023_08_09T11_01_47.680717", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T11:01:47.680717.parquet"]}, {"split": "2023_08_09T19_53_44.921082", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T19:53:44.921082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-09T19:53:44.921082.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_09T10_54_28.159442", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T10:54:28.159442.parquet"]}, {"split": "2023_08_09T11_01_47.680717", "path": 
["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T11:01:47.680717.parquet"]}, {"split": "2023_08_09T19_53_44.921082", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T19:53:44.921082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-09T19:53:44.921082.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_09T10_54_28.159442", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T10:54:28.159442.parquet"]}, {"split": "2023_08_09T11_01_47.680717", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T11:01:47.680717.parquet"]}, {"split": "2023_08_09T19_53_44.921082", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T19:53:44.921082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-09T19:53:44.921082.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_09T10_54_28.159442", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T10:54:28.159442.parquet"]}, {"split": "2023_08_09T11_01_47.680717", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T11:01:47.680717.parquet"]}, {"split": "2023_08_09T19_53_44.921082", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T19:53:44.921082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-09T19:53:44.921082.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_09T10_54_28.159442", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T10:54:28.159442.parquet"]}, {"split": "2023_08_09T11_01_47.680717", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T11:01:47.680717.parquet"]}, {"split": "2023_08_09T19_53_44.921082", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T19:53:44.921082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-09T19:53:44.921082.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_09T10_54_28.159442", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T10:54:28.159442.parquet"]}, {"split": "2023_08_09T11_01_47.680717", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T11:01:47.680717.parquet"]}, {"split": "2023_08_09T19_53_44.921082", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T19:53:44.921082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-09T19:53:44.921082.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_09T10_54_28.159442", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T10:54:28.159442.parquet"]}, {"split": "2023_08_09T11_01_47.680717", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T11:01:47.680717.parquet"]}, {"split": "2023_08_09T19_53_44.921082", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T19:53:44.921082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-09T19:53:44.921082.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", 
"data_files": [{"split": "2023_08_09T10_54_28.159442", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T10:54:28.159442.parquet"]}, {"split": "2023_08_09T11_01_47.680717", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T11:01:47.680717.parquet"]}, {"split": "2023_08_09T19_53_44.921082", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T19:53:44.921082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-09T19:53:44.921082.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_09T10_54_28.159442", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T10:54:28.159442.parquet"]}, {"split": "2023_08_09T11_01_47.680717", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T11:01:47.680717.parquet"]}, {"split": "2023_08_09T19_53_44.921082", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T19:53:44.921082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-09T19:53:44.921082.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_09T10_54_28.159442", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T10:54:28.159442.parquet"]}, {"split": "2023_08_09T11_01_47.680717", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T11:01:47.680717.parquet"]}, {"split": "2023_08_09T19_53_44.921082", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T19:53:44.921082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-09T19:53:44.921082.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_09T10_54_28.159442", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T10:54:28.159442.parquet"]}, {"split": "2023_08_09T11_01_47.680717", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T11:01:47.680717.parquet"]}, {"split": "2023_08_09T19_53_44.921082", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T19:53:44.921082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-09T19:53:44.921082.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_09T10_54_28.159442", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T10:54:28.159442.parquet"]}, {"split": "2023_08_09T11_01_47.680717", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T11:01:47.680717.parquet"]}, {"split": "2023_08_09T19_53_44.921082", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T19:53:44.921082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-09T19:53:44.921082.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_09T10_54_28.159442", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T10:54:28.159442.parquet"]}, {"split": "2023_08_09T11_01_47.680717", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T11:01:47.680717.parquet"]}, {"split": "2023_08_09T19_53_44.921082", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T19:53:44.921082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-09T19:53:44.921082.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_09T10_54_28.159442", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T10:54:28.159442.parquet"]}, {"split": "2023_08_09T11_01_47.680717", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T11:01:47.680717.parquet"]}, {"split": "2023_08_09T19_53_44.921082", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T19:53:44.921082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-09T19:53:44.921082.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_09T10_54_28.159442", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T10:54:28.159442.parquet"]}, {"split": "2023_08_09T11_01_47.680717", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T11:01:47.680717.parquet"]}, {"split": "2023_08_09T19_53_44.921082", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T19:53:44.921082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-09T19:53:44.921082.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_09T10_54_28.159442", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T10:54:28.159442.parquet"]}, {"split": "2023_08_09T11_01_47.680717", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T11:01:47.680717.parquet"]}, {"split": "2023_08_09T19_53_44.921082", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T19:53:44.921082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-09T19:53:44.921082.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_09T10_54_28.159442", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T10:54:28.159442.parquet"]}, {"split": "2023_08_09T11_01_47.680717", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T11:01:47.680717.parquet"]}, {"split": "2023_08_09T19_53_44.921082", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T19:53:44.921082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-09T19:53:44.921082.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_09T10_54_28.159442", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T10:54:28.159442.parquet"]}, {"split": "2023_08_09T11_01_47.680717", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T11:01:47.680717.parquet"]}, {"split": "2023_08_09T19_53_44.921082", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T19:53:44.921082.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-09T19:53:44.921082.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_09T10_54_28.159442", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T10:54:28.159442.parquet"]}, {"split": "2023_08_09T11_01_47.680717", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T11:01:47.680717.parquet"]}, {"split": "2023_08_09T19_53_44.921082", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T19:53:44.921082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-09T19:53:44.921082.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_09T10_54_28.159442", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T10:54:28.159442.parquet"]}, {"split": "2023_08_09T11_01_47.680717", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T11:01:47.680717.parquet"]}, {"split": "2023_08_09T19_53_44.921082", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T19:53:44.921082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-09T19:53:44.921082.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_09T10_54_28.159442", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T10:54:28.159442.parquet"]}, {"split": "2023_08_09T11_01_47.680717", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T11:01:47.680717.parquet"]}, {"split": "2023_08_09T19_53_44.921082", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T19:53:44.921082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-09T19:53:44.921082.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_09T10_54_28.159442", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T10:54:28.159442.parquet"]}, {"split": "2023_08_09T11_01_47.680717", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T11:01:47.680717.parquet"]}, {"split": "2023_08_09T19_53_44.921082", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T19:53:44.921082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-09T19:53:44.921082.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_09T10_54_28.159442", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T10:54:28.159442.parquet"]}, {"split": "2023_08_09T11_01_47.680717", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T11:01:47.680717.parquet"]}, {"split": "2023_08_09T19_53_44.921082", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T19:53:44.921082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-09T19:53:44.921082.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_09T10_54_28.159442", "path": 
["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T10:54:28.159442.parquet"]}, {"split": "2023_08_09T11_01_47.680717", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T11:01:47.680717.parquet"]}, {"split": "2023_08_09T19_53_44.921082", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T19:53:44.921082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-09T19:53:44.921082.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_09T10_54_28.159442", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T10:54:28.159442.parquet"]}, {"split": "2023_08_09T11_01_47.680717", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T11:01:47.680717.parquet"]}, {"split": "2023_08_09T19_53_44.921082", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T19:53:44.921082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-09T19:53:44.921082.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_09T10_54_28.159442", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T10:54:28.159442.parquet"]}, {"split": "2023_08_09T11_01_47.680717", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T11:01:47.680717.parquet"]}, {"split": "2023_08_09T19_53_44.921082", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T19:53:44.921082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-09T19:53:44.921082.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_09T10_54_28.159442", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T10:54:28.159442.parquet"]}, {"split": "2023_08_09T11_01_47.680717", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T11:01:47.680717.parquet"]}, {"split": "2023_08_09T19_53_44.921082", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T19:53:44.921082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-09T19:53:44.921082.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_09T10_54_28.159442", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T10:54:28.159442.parquet"]}, {"split": "2023_08_09T11_01_47.680717", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T11:01:47.680717.parquet"]}, {"split": "2023_08_09T19_53_44.921082", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T19:53:44.921082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-09T19:53:44.921082.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_09T10_54_28.159442", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T10:54:28.159442.parquet"]}, {"split": "2023_08_09T11_01_47.680717", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T11:01:47.680717.parquet"]}, {"split": "2023_08_09T19_53_44.921082", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T19:53:44.921082.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-09T19:53:44.921082.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_09T10_54_28.159442", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T10:54:28.159442.parquet"]}, {"split": "2023_08_09T11_01_47.680717", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T11:01:47.680717.parquet"]}, {"split": "2023_08_09T19_53_44.921082", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T19:53:44.921082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-09T19:53:44.921082.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_09T10_54_28.159442", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T10:54:28.159442.parquet"]}, {"split": "2023_08_09T11_01_47.680717", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T11:01:47.680717.parquet"]}, {"split": "2023_08_09T19_53_44.921082", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T19:53:44.921082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-09T19:53:44.921082.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_09T10_54_28.159442", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T10:54:28.159442.parquet"]}, {"split": "2023_08_09T11_01_47.680717", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T11:01:47.680717.parquet"]}, {"split": "2023_08_09T19_53_44.921082", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T19:53:44.921082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-09T19:53:44.921082.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_09T10_54_28.159442", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T10:54:28.159442.parquet"]}, {"split": "2023_08_09T11_01_47.680717", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T11:01:47.680717.parquet"]}, {"split": "2023_08_09T19_53_44.921082", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T19:53:44.921082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-09T19:53:44.921082.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_09T10_54_28.159442", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T10:54:28.159442.parquet"]}, {"split": "2023_08_09T11_01_47.680717", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T11:01:47.680717.parquet"]}, {"split": "2023_08_09T19_53_44.921082", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T19:53:44.921082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-09T19:53:44.921082.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_09T10_54_28.159442", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T10:54:28.159442.parquet"]}, {"split": "2023_08_09T11_01_47.680717", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T11:01:47.680717.parquet"]}, {"split": "2023_08_09T19_53_44.921082", "path": 
["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T19:53:44.921082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-09T19:53:44.921082.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_09T10_54_28.159442", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T10:54:28.159442.parquet"]}, {"split": "2023_08_09T11_01_47.680717", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T11:01:47.680717.parquet"]}, {"split": "2023_08_09T19_53_44.921082", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T19:53:44.921082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-09T19:53:44.921082.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_09T10_54_28.159442", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T10:54:28.159442.parquet"]}, {"split": "2023_08_09T11_01_47.680717", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T11:01:47.680717.parquet"]}, {"split": "2023_08_09T19_53_44.921082", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T19:53:44.921082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-09T19:53:44.921082.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_09T10_54_28.159442", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T10:54:28.159442.parquet"]}, {"split": "2023_08_09T11_01_47.680717", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T11:01:47.680717.parquet"]}, {"split": "2023_08_09T19_53_44.921082", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T19:53:44.921082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-09T19:53:44.921082.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_09T10_54_28.159442", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T10:54:28.159442.parquet"]}, {"split": "2023_08_09T11_01_47.680717", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T11:01:47.680717.parquet"]}, {"split": "2023_08_09T19_53_44.921082", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T19:53:44.921082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-09T19:53:44.921082.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_09T10_54_28.159442", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T10:54:28.159442.parquet"]}, {"split": "2023_08_09T11_01_47.680717", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T11:01:47.680717.parquet"]}, {"split": "2023_08_09T19_53_44.921082", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T19:53:44.921082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-09T19:53:44.921082.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_09T10_54_28.159442", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T10:54:28.159442.parquet"]}, {"split": "2023_08_09T11_01_47.680717", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T11:01:47.680717.parquet"]}, {"split": "2023_08_09T19_53_44.921082", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T19:53:44.921082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-09T19:53:44.921082.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_09T10_54_28.159442", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T10:54:28.159442.parquet"]}, {"split": "2023_08_09T11_01_47.680717", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T11:01:47.680717.parquet"]}, {"split": "2023_08_09T19_53_44.921082", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T19:53:44.921082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-09T19:53:44.921082.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_09T10_54_28.159442", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T10:54:28.159442.parquet"]}, {"split": "2023_08_09T11_01_47.680717", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T11:01:47.680717.parquet"]}, {"split": "2023_08_09T19_53_44.921082", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T19:53:44.921082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-09T19:53:44.921082.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_09T10_54_28.159442", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T10:54:28.159442.parquet"]}, {"split": "2023_08_09T11_01_47.680717", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T11:01:47.680717.parquet"]}, {"split": "2023_08_09T19_53_44.921082", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T19:53:44.921082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-09T19:53:44.921082.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_09T10_54_28.159442", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T10:54:28.159442.parquet"]}, {"split": "2023_08_09T11_01_47.680717", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T11:01:47.680717.parquet"]}, {"split": "2023_08_09T19_53_44.921082", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T19:53:44.921082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-09T19:53:44.921082.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_09T10_54_28.159442", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T10:54:28.159442.parquet"]}, {"split": "2023_08_09T11_01_47.680717", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T11:01:47.680717.parquet"]}, {"split": "2023_08_09T19_53_44.921082", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T19:53:44.921082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-09T19:53:44.921082.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", 
"data_files": [{"split": "2023_08_09T10_54_28.159442", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T10:54:28.159442.parquet"]}, {"split": "2023_08_09T11_01_47.680717", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T11:01:47.680717.parquet"]}, {"split": "2023_08_09T19_53_44.921082", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T19:53:44.921082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-09T19:53:44.921082.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_09T10_54_28.159442", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T10:54:28.159442.parquet"]}, {"split": "2023_08_09T11_01_47.680717", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T11:01:47.680717.parquet"]}, {"split": "2023_08_09T19_53_44.921082", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T19:53:44.921082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-09T19:53:44.921082.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_09T10_54_28.159442", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T10:54:28.159442.parquet"]}, {"split": "2023_08_09T11_01_47.680717", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T11:01:47.680717.parquet"]}, {"split": "2023_08_09T19_53_44.921082", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T19:53:44.921082.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-09T19:53:44.921082.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_09T10_54_28.159442", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T10:54:28.159442.parquet"]}, {"split": "2023_08_09T11_01_47.680717", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T11:01:47.680717.parquet"]}, {"split": "2023_08_09T19_53_44.921082", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T19:53:44.921082.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-09T19:53:44.921082.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_17T22_24_47.406808", "path": ["**/details_harness|winogrande|5_2023-09-17T22-24-47.406808.parquet"]}, {"split": "2023_10_13T14_08_03.258972", "path": ["**/details_harness|winogrande|5_2023-10-13T14-08-03.258972.parquet"]}, {"split": "2023_10_14T08_18_52.745685", "path": ["**/details_harness|winogrande|5_2023-10-14T08-18-52.745685.parquet"]}, {"split": "2023_10_16T12_28_11.016345", "path": ["**/details_harness|winogrande|5_2023-10-16T12-28-11.016345.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-16T12-28-11.016345.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_08_09T10_54_28.159442", "path": ["results_2023-08-09T10:54:28.159442.parquet"]}, {"split": "2023_08_09T11_01_47.680717", "path": ["results_2023-08-09T11:01:47.680717.parquet"]}, {"split": "2023_08_09T19_53_44.921082", "path": ["results_2023-08-09T19:53:44.921082.parquet"]}, {"split": "2023_09_17T22_24_47.406808", "path": ["results_2023-09-17T22-24-47.406808.parquet"]}, {"split": "2023_10_13T14_08_03.258972", "path": ["results_2023-10-13T14-08-03.258972.parquet"]}, {"split": "2023_10_14T08_18_52.745685", "path": 
["results_2023-10-14T08-18-52.745685.parquet"]}, {"split": "2023_10_16T12_28_11.016345", "path": ["results_2023-10-16T12-28-11.016345.parquet"]}, {"split": "latest", "path": ["results_2023-10-16T12-28-11.016345.parquet"]}]}]}
|
2023-10-16T11:28:24+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of Open-Orca/OpenOrcaxOpenChat-Preview2-13B
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model Open-Orca/OpenOrcaxOpenChat-Preview2-13B on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 7 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
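(The code block is stripped in this rendering; below is a minimal loading sketch. The repo id is an assumption based on the leaderboard's usual `details_{org}__{model}` naming convention, and the `harness_winogrande_5` config is listed in this dataset's metadata.)

```python
from datasets import load_dataset

# Assumed repo id, following the leaderboard's naming convention.
data = load_dataset("open-llm-leaderboard/details_Open-Orca__OpenOrcaxOpenChat-Preview2-13B",
	"harness_winogrande_5",
	split="train")
```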
## Latest results
These are the latest results from run 2023-10-16T12:28:11.016345 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of Open-Orca/OpenOrcaxOpenChat-Preview2-13B",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Open-Orca/OpenOrcaxOpenChat-Preview2-13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 7 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-16T12:28:11.016345(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Open-Orca/OpenOrcaxOpenChat-Preview2-13B",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Open-Orca/OpenOrcaxOpenChat-Preview2-13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 7 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-16T12:28:11.016345(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
26,
31,
174,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Open-Orca/OpenOrcaxOpenChat-Preview2-13B## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Open-Orca/OpenOrcaxOpenChat-Preview2-13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 7 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-16T12:28:11.016345(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
b65a2eeda9c60d0c69955d4993ac41b338ee99a6
|
# Dataset Card for Evaluation run of stabilityai/stablelm-tuned-alpha-3b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/stabilityai/stablelm-tuned-alpha-3b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [stabilityai/stablelm-tuned-alpha-3b](https://huggingface.co/stabilityai/stablelm-tuned-alpha-3b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_stabilityai__stablelm-tuned-alpha-3b",
"harness_winogrande_5",
split="train")
```
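The aggregated metrics live in the "results" configuration; a minimal sketch for loading them (the config and split names are taken from this card's metadata):

```python
from datasets import load_dataset

# "results" stores the aggregated metrics; the "latest" split always
# points to the newest run.
results = load_dataset("open-llm-leaderboard/details_stabilityai__stablelm-tuned-alpha-3b",
	"results",
	split="latest")
```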
## Latest results
These are the [latest results from run 2023-10-15T21:41:17.455218](https://huggingface.co/datasets/open-llm-leaderboard/details_stabilityai__stablelm-tuned-alpha-3b/blob/main/results_2023-10-15T21-41-17.455218.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0036703020134228187,
"em_stderr": 0.0006192871806511154,
"f1": 0.05061136744966456,
"f1_stderr": 0.00134828623344778,
"acc": 0.27771272034672656,
"acc_stderr": 0.007991508812498901
},
"harness|drop|3": {
"em": 0.0036703020134228187,
"em_stderr": 0.0006192871806511154,
"f1": 0.05061136744966456,
"f1_stderr": 0.00134828623344778
},
"harness|gsm8k|5": {
"acc": 0.00530705079605762,
"acc_stderr": 0.002001305720948078
},
"harness|winogrande|5": {
"acc": 0.5501183898973955,
"acc_stderr": 0.013981711904049725
}
}
```
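To fetch that results file directly instead of going through `datasets`, one option is `huggingface_hub`; a sketch, assuming only that the JSON linked above sits at the repo root under the filename shown in its URL:

```python
import json

from huggingface_hub import hf_hub_download

# Download the raw results file referenced above from the dataset repo.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_stabilityai__stablelm-tuned-alpha-3b",
    filename="results_2023-10-15T21-41-17.455218.json",
    repo_type="dataset",
)
with open(path) as f:
    results = json.load(f)
print(list(results))  # top-level keys of the results file
```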
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_stabilityai__stablelm-tuned-alpha-3b
|
[
"region:us"
] |
2023-08-17T23:19:24+00:00
|
{"pretty_name": "Evaluation run of stabilityai/stablelm-tuned-alpha-3b", "dataset_summary": "Dataset automatically created during the evaluation run of model [stabilityai/stablelm-tuned-alpha-3b](https://huggingface.co/stabilityai/stablelm-tuned-alpha-3b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_stabilityai__stablelm-tuned-alpha-3b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-15T21:41:17.455218](https://huggingface.co/datasets/open-llm-leaderboard/details_stabilityai__stablelm-tuned-alpha-3b/blob/main/results_2023-10-15T21-41-17.455218.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0036703020134228187,\n \"em_stderr\": 0.0006192871806511154,\n \"f1\": 0.05061136744966456,\n \"f1_stderr\": 0.00134828623344778,\n \"acc\": 0.27771272034672656,\n \"acc_stderr\": 0.007991508812498901\n },\n \"harness|drop|3\": {\n \"em\": 0.0036703020134228187,\n \"em_stderr\": 0.0006192871806511154,\n \"f1\": 0.05061136744966456,\n \"f1_stderr\": 0.00134828623344778\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.00530705079605762,\n \"acc_stderr\": 0.002001305720948078\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5501183898973955,\n \"acc_stderr\": 0.013981711904049725\n }\n}\n```", "repo_url": "https://huggingface.co/stabilityai/stablelm-tuned-alpha-3b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T14_49_37.876156", "path": ["**/details_harness|arc:challenge|25_2023-07-19T14:49:37.876156.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T14:49:37.876156.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_15T21_41_17.455218", "path": ["**/details_harness|drop|3_2023-10-15T21-41-17.455218.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-15T21-41-17.455218.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_15T21_41_17.455218", "path": ["**/details_harness|gsm8k|5_2023-10-15T21-41-17.455218.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-15T21-41-17.455218.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T14_49_37.876156", "path": ["**/details_harness|hellaswag|10_2023-07-19T14:49:37.876156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T14:49:37.876156.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T14_49_37.876156", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:49:37.876156.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:49:37.876156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:49:37.876156.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T14:49:37.876156.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:49:37.876156.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T14_49_37.876156", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:49:37.876156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:49:37.876156.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T14_49_37.876156", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:49:37.876156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:49:37.876156.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T14_49_37.876156", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:49:37.876156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:49:37.876156.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T14_49_37.876156", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:49:37.876156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:49:37.876156.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T14_49_37.876156", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:49:37.876156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:49:37.876156.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T14_49_37.876156", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:49:37.876156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:49:37.876156.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T14_49_37.876156", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:49:37.876156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:49:37.876156.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T14_49_37.876156", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:49:37.876156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:49:37.876156.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T14_49_37.876156", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:49:37.876156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:49:37.876156.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T14_49_37.876156", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:49:37.876156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:49:37.876156.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T14_49_37.876156", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:49:37.876156.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:49:37.876156.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T14_49_37.876156", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:49:37.876156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:49:37.876156.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T14_49_37.876156", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:49:37.876156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:49:37.876156.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T14_49_37.876156", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:49:37.876156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:49:37.876156.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T14_49_37.876156", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:49:37.876156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:49:37.876156.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T14_49_37.876156", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:49:37.876156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:49:37.876156.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T14_49_37.876156", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:49:37.876156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:49:37.876156.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T14_49_37.876156", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:49:37.876156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:49:37.876156.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T14_49_37.876156", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:49:37.876156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:49:37.876156.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T14_49_37.876156", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:49:37.876156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:49:37.876156.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T14_49_37.876156", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:49:37.876156.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:49:37.876156.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T14_49_37.876156", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:49:37.876156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:49:37.876156.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T14_49_37.876156", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:49:37.876156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:49:37.876156.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T14_49_37.876156", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:49:37.876156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:49:37.876156.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T14_49_37.876156", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:49:37.876156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:49:37.876156.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T14_49_37.876156", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:49:37.876156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:49:37.876156.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T14_49_37.876156", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:49:37.876156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:49:37.876156.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T14_49_37.876156", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:49:37.876156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:49:37.876156.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T14_49_37.876156", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:49:37.876156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:49:37.876156.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T14_49_37.876156", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:49:37.876156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:49:37.876156.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T14_49_37.876156", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:49:37.876156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:49:37.876156.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T14_49_37.876156", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:49:37.876156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:49:37.876156.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T14_49_37.876156", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:49:37.876156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:49:37.876156.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T14_49_37.876156", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:49:37.876156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:49:37.876156.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T14_49_37.876156", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:49:37.876156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:49:37.876156.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T14_49_37.876156", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:49:37.876156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:49:37.876156.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T14_49_37.876156", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:49:37.876156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:49:37.876156.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T14_49_37.876156", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:49:37.876156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:49:37.876156.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T14_49_37.876156", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T14:49:37.876156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T14:49:37.876156.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T14_49_37.876156", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:49:37.876156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:49:37.876156.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T14_49_37.876156", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:49:37.876156.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:49:37.876156.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T14_49_37.876156", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:49:37.876156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:49:37.876156.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T14_49_37.876156", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:49:37.876156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:49:37.876156.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T14_49_37.876156", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:49:37.876156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:49:37.876156.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T14_49_37.876156", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:49:37.876156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:49:37.876156.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T14_49_37.876156", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:49:37.876156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:49:37.876156.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T14_49_37.876156", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:49:37.876156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:49:37.876156.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T14_49_37.876156", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:49:37.876156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:49:37.876156.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T14_49_37.876156", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:49:37.876156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:49:37.876156.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T14_49_37.876156", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:49:37.876156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:49:37.876156.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T14_49_37.876156", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:49:37.876156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:49:37.876156.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T14_49_37.876156", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:49:37.876156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:49:37.876156.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T14_49_37.876156", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:49:37.876156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:49:37.876156.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T14_49_37.876156", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:49:37.876156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:49:37.876156.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T14_49_37.876156", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:49:37.876156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:49:37.876156.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T14_49_37.876156", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T14:49:37.876156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T14:49:37.876156.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T14_49_37.876156", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:49:37.876156.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:49:37.876156.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T14_49_37.876156", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T14:49:37.876156.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T14:49:37.876156.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_15T21_41_17.455218", "path": ["**/details_harness|winogrande|5_2023-10-15T21-41-17.455218.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-15T21-41-17.455218.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T14_49_37.876156", "path": ["results_2023-07-19T14:49:37.876156.parquet"]}, {"split": "2023_10_15T21_41_17.455218", "path": ["results_2023-10-15T21-41-17.455218.parquet"]}, {"split": "latest", "path": ["results_2023-10-15T21-41-17.455218.parquet"]}]}]}
|
2023-10-15T20:41:28+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of stabilityai/stablelm-tuned-alpha-3b
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model stabilityai/stablelm-tuned-alpha-3b on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
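(The code block is stripped in this rendering; the full card for this dataset gives the call, reproduced here:)

```python
from datasets import load_dataset

data = load_dataset("open-llm-leaderboard/details_stabilityai__stablelm-tuned-alpha-3b",
	"harness_winogrande_5",
	split="train")
```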
## Latest results
These are the latest results from run 2023-10-15T21:41:17.455218 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of stabilityai/stablelm-tuned-alpha-3b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model stabilityai/stablelm-tuned-alpha-3b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-15T21:41:17.455218(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of stabilityai/stablelm-tuned-alpha-3b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model stabilityai/stablelm-tuned-alpha-3b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-15T21:41:17.455218(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
24,
31,
172,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of stabilityai/stablelm-tuned-alpha-3b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model stabilityai/stablelm-tuned-alpha-3b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-15T21:41:17.455218(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
00b0f8190adcdc2835ffd61c48e916760bc8a9c7
|
# Dataset Card for Evaluation run of stabilityai/stablelm-base-alpha-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/stabilityai/stablelm-base-alpha-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [stabilityai/stablelm-base-alpha-7b](https://huggingface.co/stabilityai/stablelm-base-alpha-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_stabilityai__stablelm-base-alpha-7b",
"harness_winogrande_5",
split="train")
```
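To see every available configuration (one per evaluated task, plus the aggregated "results" config) before loading anything, `datasets` can enumerate them; a minimal sketch:

```python
from datasets import get_dataset_config_names

# One config per evaluated task, e.g. "harness_winogrande_5", plus "results".
configs = get_dataset_config_names("open-llm-leaderboard/details_stabilityai__stablelm-base-alpha-7b")
print(len(configs), configs[:5])
```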
## Latest results
These are the [latest results from run 2023-10-15T02:46:02.440007](https://huggingface.co/datasets/open-llm-leaderboard/details_stabilityai__stablelm-base-alpha-7b/blob/main/results_2023-10-15T02-46-02.440007.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.06512164429530201,
"em_stderr": 0.0025268524486565633,
"f1": 0.10947252516778525,
"f1_stderr": 0.002753310573502515,
"acc": 0.28006496036017814,
"acc_stderr": 0.00805438189189559
},
"harness|drop|3": {
"em": 0.06512164429530201,
"em_stderr": 0.0025268524486565633,
"f1": 0.10947252516778525,
"f1_stderr": 0.002753310573502515
},
"harness|gsm8k|5": {
"acc": 0.006065200909780136,
"acc_stderr": 0.002138670301460488
},
"harness|winogrande|5": {
"acc": 0.5540647198105761,
"acc_stderr": 0.013970093482330692
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_stabilityai__stablelm-base-alpha-7b
|
[
"region:us"
] |
2023-08-17T23:19:33+00:00
|
{"pretty_name": "Evaluation run of stabilityai/stablelm-base-alpha-7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [stabilityai/stablelm-base-alpha-7b](https://huggingface.co/stabilityai/stablelm-base-alpha-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_stabilityai__stablelm-base-alpha-7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-15T02:46:02.440007](https://huggingface.co/datasets/open-llm-leaderboard/details_stabilityai__stablelm-base-alpha-7b/blob/main/results_2023-10-15T02-46-02.440007.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.06512164429530201,\n \"em_stderr\": 0.0025268524486565633,\n \"f1\": 0.10947252516778525,\n \"f1_stderr\": 0.002753310573502515,\n \"acc\": 0.28006496036017814,\n \"acc_stderr\": 0.00805438189189559\n },\n \"harness|drop|3\": {\n \"em\": 0.06512164429530201,\n \"em_stderr\": 0.0025268524486565633,\n \"f1\": 0.10947252516778525,\n \"f1_stderr\": 0.002753310573502515\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.006065200909780136,\n \"acc_stderr\": 0.002138670301460488\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5540647198105761,\n \"acc_stderr\": 0.013970093482330692\n }\n}\n```", "repo_url": "https://huggingface.co/stabilityai/stablelm-base-alpha-7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T17_27_18.412966", "path": ["**/details_harness|arc:challenge|25_2023-07-19T17:27:18.412966.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T17:27:18.412966.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_15T02_46_02.440007", "path": ["**/details_harness|drop|3_2023-10-15T02-46-02.440007.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-15T02-46-02.440007.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_15T02_46_02.440007", "path": ["**/details_harness|gsm8k|5_2023-10-15T02-46-02.440007.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-15T02-46-02.440007.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T17_27_18.412966", "path": ["**/details_harness|hellaswag|10_2023-07-19T17:27:18.412966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T17:27:18.412966.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T17_27_18.412966", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:27:18.412966.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:27:18.412966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:27:18.412966.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T17:27:18.412966.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:27:18.412966.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T17_27_18.412966", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:27:18.412966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:27:18.412966.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T17_27_18.412966", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:27:18.412966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:27:18.412966.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T17_27_18.412966", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:27:18.412966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:27:18.412966.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T17_27_18.412966", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:27:18.412966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:27:18.412966.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T17_27_18.412966", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:27:18.412966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:27:18.412966.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T17_27_18.412966", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:27:18.412966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:27:18.412966.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T17_27_18.412966", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:27:18.412966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:27:18.412966.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T17_27_18.412966", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:27:18.412966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:27:18.412966.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T17_27_18.412966", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:27:18.412966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:27:18.412966.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T17_27_18.412966", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:27:18.412966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:27:18.412966.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T17_27_18.412966", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:27:18.412966.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:27:18.412966.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T17_27_18.412966", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:27:18.412966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:27:18.412966.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T17_27_18.412966", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:27:18.412966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:27:18.412966.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T17_27_18.412966", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:27:18.412966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:27:18.412966.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T17_27_18.412966", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:27:18.412966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:27:18.412966.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T17_27_18.412966", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:27:18.412966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:27:18.412966.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T17_27_18.412966", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:27:18.412966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:27:18.412966.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T17_27_18.412966", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:27:18.412966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:27:18.412966.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T17_27_18.412966", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:27:18.412966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:27:18.412966.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T17_27_18.412966", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:27:18.412966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:27:18.412966.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T17_27_18.412966", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:27:18.412966.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:27:18.412966.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T17_27_18.412966", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:27:18.412966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:27:18.412966.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T17_27_18.412966", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:27:18.412966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:27:18.412966.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T17_27_18.412966", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:27:18.412966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:27:18.412966.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T17_27_18.412966", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:27:18.412966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:27:18.412966.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T17_27_18.412966", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:27:18.412966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:27:18.412966.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T17_27_18.412966", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:27:18.412966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:27:18.412966.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T17_27_18.412966", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:27:18.412966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:27:18.412966.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T17_27_18.412966", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:27:18.412966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:27:18.412966.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T17_27_18.412966", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:27:18.412966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:27:18.412966.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T17_27_18.412966", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:27:18.412966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:27:18.412966.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T17_27_18.412966", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:27:18.412966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:27:18.412966.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T17_27_18.412966", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:27:18.412966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:27:18.412966.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T17_27_18.412966", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:27:18.412966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:27:18.412966.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T17_27_18.412966", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:27:18.412966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:27:18.412966.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T17_27_18.412966", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:27:18.412966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:27:18.412966.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T17_27_18.412966", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:27:18.412966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:27:18.412966.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T17_27_18.412966", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:27:18.412966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:27:18.412966.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T17_27_18.412966", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T17:27:18.412966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T17:27:18.412966.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T17_27_18.412966", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:27:18.412966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:27:18.412966.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T17_27_18.412966", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:27:18.412966.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:27:18.412966.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T17_27_18.412966", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:27:18.412966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:27:18.412966.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T17_27_18.412966", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:27:18.412966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:27:18.412966.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T17_27_18.412966", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:27:18.412966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:27:18.412966.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T17_27_18.412966", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:27:18.412966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:27:18.412966.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T17_27_18.412966", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:27:18.412966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:27:18.412966.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T17_27_18.412966", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:27:18.412966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:27:18.412966.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T17_27_18.412966", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:27:18.412966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:27:18.412966.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T17_27_18.412966", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:27:18.412966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:27:18.412966.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T17_27_18.412966", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:27:18.412966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:27:18.412966.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T17_27_18.412966", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:27:18.412966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:27:18.412966.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T17_27_18.412966", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:27:18.412966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:27:18.412966.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T17_27_18.412966", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:27:18.412966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:27:18.412966.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T17_27_18.412966", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:27:18.412966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:27:18.412966.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T17_27_18.412966", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:27:18.412966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:27:18.412966.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T17_27_18.412966", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T17:27:18.412966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T17:27:18.412966.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T17_27_18.412966", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:27:18.412966.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:27:18.412966.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T17_27_18.412966", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T17:27:18.412966.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T17:27:18.412966.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_15T02_46_02.440007", "path": ["**/details_harness|winogrande|5_2023-10-15T02-46-02.440007.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-15T02-46-02.440007.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T17_27_18.412966", "path": ["results_2023-07-19T17:27:18.412966.parquet"]}, {"split": "2023_10_15T02_46_02.440007", "path": ["results_2023-10-15T02-46-02.440007.parquet"]}, {"split": "latest", "path": ["results_2023-10-15T02-46-02.440007.parquet"]}]}]}
|
2023-10-15T01:46:14+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of stabilityai/stablelm-base-alpha-7b
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model stabilityai/stablelm-base-alpha-7b on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-10-15T02:46:02.440007 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of stabilityai/stablelm-base-alpha-7b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model stabilityai/stablelm-base-alpha-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-15T02:46:02.440007(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of stabilityai/stablelm-base-alpha-7b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model stabilityai/stablelm-base-alpha-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-15T02:46:02.440007(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
23,
31,
171,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of stabilityai/stablelm-base-alpha-7b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model stabilityai/stablelm-base-alpha-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-15T02:46:02.440007(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
556bcd97d14020dd629156c16223f5652fff0c03
|
# Dataset Card for Evaluation run of stabilityai/stablelm-base-alpha-7b-v2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/stabilityai/stablelm-base-alpha-7b-v2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [stabilityai/stablelm-base-alpha-7b-v2](https://huggingface.co/stabilityai/stablelm-base-alpha-7b-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_stabilityai__stablelm-base-alpha-7b-v2_public",
"harness_winogrande_5",
split="train")
```
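Beyond the per-sample details, the "results" config mentioned above can be loaded the same way. A short sketch (it assumes only what the card states: a "results" config whose "latest" split points at the newest run):
```python
from datasets import load_dataset

# The "results" config stores the aggregated metrics of each run;
# the "latest" split always points at the most recent one.
results = load_dataset(
    "open-llm-leaderboard/details_stabilityai__stablelm-base-alpha-7b-v2_public",
    "results",
    split="latest",
)
print(results[0])
```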
## Latest results
These are the [latest results from run 2023-11-08T16:00:35.193970](https://huggingface.co/datasets/open-llm-leaderboard/details_stabilityai__stablelm-base-alpha-7b-v2_public/blob/main/results_2023-11-08T16-00-35.193970.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0008389261744966443,
"em_stderr": 0.0002964962989801257,
"f1": 0.050772860738255036,
"f1_stderr": 0.0012096254597904722,
"acc": 0.3554299883973712,
"acc_stderr": 0.008709660261045525
},
"harness|drop|3": {
"em": 0.0008389261744966443,
"em_stderr": 0.0002964962989801257,
"f1": 0.050772860738255036,
"f1_stderr": 0.0012096254597904722
},
"harness|gsm8k|5": {
"acc": 0.02577710386656558,
"acc_stderr": 0.004365042953621817
},
"harness|winogrande|5": {
"acc": 0.6850828729281768,
"acc_stderr": 0.013054277568469231
}
}
```
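Since each evaluation repository exposes several configs, it can help to enumerate them before loading anything. A minimal sketch with the stock `datasets` helper (assuming the repository is publicly readable):
```python
from datasets import get_dataset_config_names

# List the available configs, e.g. harness_drop_3, harness_gsm8k_5,
# harness_winogrande_5 and results
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_stabilityai__stablelm-base-alpha-7b-v2_public"
)
print(configs)
```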
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_stabilityai__stablelm-base-alpha-7b-v2
|
[
"region:us"
] |
2023-08-17T23:19:42+00:00
|
{"pretty_name": "Evaluation run of stabilityai/stablelm-base-alpha-7b-v2", "dataset_summary": "Dataset automatically created during the evaluation run of model [stabilityai/stablelm-base-alpha-7b-v2](https://huggingface.co/stabilityai/stablelm-base-alpha-7b-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 3 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_stabilityai__stablelm-base-alpha-7b-v2_public\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-11-08T16:00:35.193970](https://huggingface.co/datasets/open-llm-leaderboard/details_stabilityai__stablelm-base-alpha-7b-v2_public/blob/main/results_2023-11-08T16-00-35.193970.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0008389261744966443,\n \"em_stderr\": 0.0002964962989801257,\n \"f1\": 0.050772860738255036,\n \"f1_stderr\": 0.0012096254597904722,\n \"acc\": 0.3554299883973712,\n \"acc_stderr\": 0.008709660261045525\n },\n \"harness|drop|3\": {\n \"em\": 0.0008389261744966443,\n \"em_stderr\": 0.0002964962989801257,\n \"f1\": 0.050772860738255036,\n \"f1_stderr\": 0.0012096254597904722\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.02577710386656558,\n \"acc_stderr\": 0.004365042953621817\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6850828729281768,\n \"acc_stderr\": 0.013054277568469231\n }\n}\n```", "repo_url": "https://huggingface.co/stabilityai/stablelm-base-alpha-7b-v2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_drop_3", "data_files": [{"split": "2023_11_08T16_00_35.193970", "path": ["**/details_harness|drop|3_2023-11-08T16-00-35.193970.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-11-08T16-00-35.193970.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_11_08T16_00_35.193970", "path": ["**/details_harness|gsm8k|5_2023-11-08T16-00-35.193970.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-11-08T16-00-35.193970.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_11_08T16_00_35.193970", "path": ["**/details_harness|winogrande|5_2023-11-08T16-00-35.193970.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-11-08T16-00-35.193970.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_11_08T16_00_35.193970", "path": ["results_2023-11-08T16-00-35.193970.parquet"]}, {"split": "latest", "path": ["results_2023-11-08T16-00-35.193970.parquet"]}]}]}
|
2023-12-01T14:49:33+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of stabilityai/stablelm-base-alpha-7b-v2
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model stabilityai/stablelm-base-alpha-7b-v2 on the Open LLM Leaderboard.
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-11-08T16:00:35.193970 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of stabilityai/stablelm-base-alpha-7b-v2",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model stabilityai/stablelm-base-alpha-7b-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 3 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-11-08T16:00:35.193970(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of stabilityai/stablelm-base-alpha-7b-v2",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model stabilityai/stablelm-base-alpha-7b-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 3 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-11-08T16:00:35.193970(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
26,
31,
175,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of stabilityai/stablelm-base-alpha-7b-v2## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model stabilityai/stablelm-base-alpha-7b-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 3 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-11-08T16:00:35.193970(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
c72e73858611378dbfc8723070fd4865e7981818
|
# Dataset Card for Evaluation run of stabilityai/stablelm-base-alpha-3b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/stabilityai/stablelm-base-alpha-3b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [stabilityai/stablelm-base-alpha-3b](https://huggingface.co/stabilityai/stablelm-base-alpha-3b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_stabilityai__stablelm-base-alpha-3b",
"harness_winogrande_5",
split="train")
```
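The aggregated metrics shown below are stored in the "results" configuration described above. A minimal sketch for loading them (the config name `"results"` and the `"latest"` split both appear in this repo's configuration metadata; only the exact record layout is an assumption, so inspect it after loading):
```python
from datasets import load_dataset

# "results" stores the aggregated metrics; the "latest" split points to the most recent run.
results = load_dataset("open-llm-leaderboard/details_stabilityai__stablelm-base-alpha-3b",
	"results",
	split="latest")
print(results[0])  # inspect the stored aggregate record
```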
## Latest results
These are the [latest results from run 2023-09-17T10:50:11.367177](https://huggingface.co/datasets/open-llm-leaderboard/details_stabilityai__stablelm-base-alpha-3b/blob/main/results_2023-09-17T10-50-11.367177.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.02506291946308725,
"em_stderr": 0.0016008246934367722,
"f1": 0.061388422818792004,
"f1_stderr": 0.001915830833902014,
"acc": 0.27180878341141224,
"acc_stderr": 0.007931538362491969
},
"harness|drop|3": {
"em": 0.02506291946308725,
"em_stderr": 0.0016008246934367722,
"f1": 0.061388422818792004,
"f1_stderr": 0.001915830833902014
},
"harness|gsm8k|5": {
"acc": 0.004548900682335102,
"acc_stderr": 0.0018535550440036204
},
"harness|winogrande|5": {
"acc": 0.5390686661404893,
"acc_stderr": 0.014009521680980316
}
}
```
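Each per-task configuration also keeps the per-sample details behind these aggregates. A short sketch for inspecting them (the schema varies by task and is not documented here, so treat the printed columns as the source of truth):
```python
from datasets import load_dataset

data = load_dataset("open-llm-leaderboard/details_stabilityai__stablelm-base-alpha-3b",
	"harness_winogrande_5",
	split="latest")

print(data.column_names)  # actual per-sample schema for this task
print(data[0])            # first evaluated example from the latest run
```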
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_stabilityai__stablelm-base-alpha-3b
|
[
"region:us"
] |
2023-08-17T23:19:51+00:00
|
{"pretty_name": "Evaluation run of stabilityai/stablelm-base-alpha-3b", "dataset_summary": "Dataset automatically created during the evaluation run of model [stabilityai/stablelm-base-alpha-3b](https://huggingface.co/stabilityai/stablelm-base-alpha-3b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_stabilityai__stablelm-base-alpha-3b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-17T10:50:11.367177](https://huggingface.co/datasets/open-llm-leaderboard/details_stabilityai__stablelm-base-alpha-3b/blob/main/results_2023-09-17T10-50-11.367177.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.02506291946308725,\n \"em_stderr\": 0.0016008246934367722,\n \"f1\": 0.061388422818792004,\n \"f1_stderr\": 0.001915830833902014,\n \"acc\": 0.27180878341141224,\n \"acc_stderr\": 0.007931538362491969\n },\n \"harness|drop|3\": {\n \"em\": 0.02506291946308725,\n \"em_stderr\": 0.0016008246934367722,\n \"f1\": 0.061388422818792004,\n \"f1_stderr\": 0.001915830833902014\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.004548900682335102,\n \"acc_stderr\": 0.0018535550440036204\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5390686661404893,\n \"acc_stderr\": 0.014009521680980316\n }\n}\n```", "repo_url": "https://huggingface.co/stabilityai/stablelm-base-alpha-3b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T14_54_44.981866", "path": ["**/details_harness|arc:challenge|25_2023-07-19T14:54:44.981866.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T14:54:44.981866.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_17T10_50_11.367177", "path": ["**/details_harness|drop|3_2023-09-17T10-50-11.367177.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-17T10-50-11.367177.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_17T10_50_11.367177", "path": ["**/details_harness|gsm8k|5_2023-09-17T10-50-11.367177.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-17T10-50-11.367177.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T14_54_44.981866", "path": ["**/details_harness|hellaswag|10_2023-07-19T14:54:44.981866.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T14:54:44.981866.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T14_54_44.981866", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:54:44.981866.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:54:44.981866.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:54:44.981866.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T14:54:44.981866.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:54:44.981866.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T14_54_44.981866", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:54:44.981866.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:54:44.981866.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T14_54_44.981866", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:54:44.981866.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:54:44.981866.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T14_54_44.981866", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:54:44.981866.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:54:44.981866.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T14_54_44.981866", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:54:44.981866.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:54:44.981866.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T14_54_44.981866", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:54:44.981866.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:54:44.981866.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T14_54_44.981866", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:54:44.981866.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:54:44.981866.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T14_54_44.981866", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:54:44.981866.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:54:44.981866.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T14_54_44.981866", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:54:44.981866.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:54:44.981866.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T14_54_44.981866", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:54:44.981866.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:54:44.981866.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T14_54_44.981866", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:54:44.981866.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:54:44.981866.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T14_54_44.981866", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:54:44.981866.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:54:44.981866.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T14_54_44.981866", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:54:44.981866.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:54:44.981866.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T14_54_44.981866", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:54:44.981866.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:54:44.981866.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T14_54_44.981866", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:54:44.981866.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:54:44.981866.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T14_54_44.981866", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:54:44.981866.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:54:44.981866.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T14_54_44.981866", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:54:44.981866.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:54:44.981866.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T14_54_44.981866", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:54:44.981866.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:54:44.981866.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T14_54_44.981866", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:54:44.981866.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:54:44.981866.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T14_54_44.981866", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:54:44.981866.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:54:44.981866.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T14_54_44.981866", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:54:44.981866.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:54:44.981866.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T14_54_44.981866", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:54:44.981866.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:54:44.981866.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T14_54_44.981866", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:54:44.981866.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:54:44.981866.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T14_54_44.981866", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:54:44.981866.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:54:44.981866.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T14_54_44.981866", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:54:44.981866.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:54:44.981866.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T14_54_44.981866", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:54:44.981866.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:54:44.981866.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T14_54_44.981866", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:54:44.981866.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:54:44.981866.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T14_54_44.981866", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:54:44.981866.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:54:44.981866.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T14_54_44.981866", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:54:44.981866.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:54:44.981866.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T14_54_44.981866", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:54:44.981866.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:54:44.981866.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T14_54_44.981866", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:54:44.981866.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:54:44.981866.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T14_54_44.981866", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:54:44.981866.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:54:44.981866.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T14_54_44.981866", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:54:44.981866.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:54:44.981866.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T14_54_44.981866", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:54:44.981866.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:54:44.981866.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T14_54_44.981866", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:54:44.981866.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:54:44.981866.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T14_54_44.981866", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:54:44.981866.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:54:44.981866.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T14_54_44.981866", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:54:44.981866.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:54:44.981866.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T14_54_44.981866", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:54:44.981866.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:54:44.981866.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T14_54_44.981866", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:54:44.981866.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:54:44.981866.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T14_54_44.981866", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T14:54:44.981866.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T14:54:44.981866.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T14_54_44.981866", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:54:44.981866.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:54:44.981866.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T14_54_44.981866", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:54:44.981866.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:54:44.981866.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T14_54_44.981866", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:54:44.981866.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:54:44.981866.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T14_54_44.981866", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:54:44.981866.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:54:44.981866.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T14_54_44.981866", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:54:44.981866.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:54:44.981866.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T14_54_44.981866", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:54:44.981866.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:54:44.981866.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T14_54_44.981866", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:54:44.981866.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:54:44.981866.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T14_54_44.981866", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:54:44.981866.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:54:44.981866.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T14_54_44.981866", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:54:44.981866.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:54:44.981866.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T14_54_44.981866", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:54:44.981866.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:54:44.981866.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T14_54_44.981866", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:54:44.981866.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:54:44.981866.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T14_54_44.981866", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:54:44.981866.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:54:44.981866.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T14_54_44.981866", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:54:44.981866.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:54:44.981866.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T14_54_44.981866", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:54:44.981866.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:54:44.981866.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T14_54_44.981866", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:54:44.981866.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:54:44.981866.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T14_54_44.981866", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:54:44.981866.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:54:44.981866.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T14_54_44.981866", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T14:54:44.981866.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T14:54:44.981866.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T14_54_44.981866", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:54:44.981866.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:54:44.981866.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T14_54_44.981866", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T14:54:44.981866.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T14:54:44.981866.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_17T10_50_11.367177", "path": ["**/details_harness|winogrande|5_2023-09-17T10-50-11.367177.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-17T10-50-11.367177.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T14_54_44.981866", "path": ["results_2023-07-19T14:54:44.981866.parquet"]}, {"split": "2023_09_17T10_50_11.367177", "path": ["results_2023-09-17T10-50-11.367177.parquet"]}, {"split": "latest", "path": ["results_2023-09-17T10-50-11.367177.parquet"]}]}]}
|
2023-09-17T09:50:22+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of stabilityai/stablelm-base-alpha-3b
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model stabilityai/stablelm-base-alpha-3b on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
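A minimal sketch, mirroring the loading example from the full card above:
```python
from datasets import load_dataset

data = load_dataset("open-llm-leaderboard/details_stabilityai__stablelm-base-alpha-3b",
	"harness_winogrande_5",
	split="train")
```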
## Latest results
These are the latest results from run 2023-09-17T10:50:11.367177 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of stabilityai/stablelm-base-alpha-3b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model stabilityai/stablelm-base-alpha-3b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The \"train\" split always points to the latest results.\n\nAn additional configuration \"results\" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-17T10:50:11.367177 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of stabilityai/stablelm-base-alpha-3b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model stabilityai/stablelm-base-alpha-3b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The \"train\" split always points to the latest results.\n\nAn additional configuration \"results\" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-17T10:50:11.367177 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
23,
31,
171,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of stabilityai/stablelm-base-alpha-3b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model stabilityai/stablelm-base-alpha-3b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The \"train\" split always points to the latest results.\n\nAn additional configuration \"results\" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-17T10:50:11.367177 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
cf2faf61a495bf76982a694c156f324f57a10eac
|
# Dataset Card for Evaluation run of stabilityai/StableBeluga-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/stabilityai/StableBeluga-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [stabilityai/StableBeluga-13B](https://huggingface.co/stabilityai/StableBeluga-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_stabilityai__StableBeluga-13B",
"harness_winogrande_5",
split="train")
```
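Because every run is stored as a timestamped split, you can enumerate what is available before loading anything. A minimal sketch using the standard `datasets` inspection helpers (only the repo name is specific to this card; the helper functions are part of the `datasets` library):
```python
from datasets import get_dataset_config_names, get_dataset_split_names

repo = "open-llm-leaderboard/details_stabilityai__StableBeluga-13B"
print(get_dataset_config_names(repo))                         # one config per evaluated task, plus "results"
print(get_dataset_split_names(repo, "harness_winogrande_5"))  # timestamped run splits and "latest"
```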
## Latest results
These are the [latest results from run 2023-10-15T18:18:48.353426](https://huggingface.co/datasets/open-llm-leaderboard/details_stabilityai__StableBeluga-13B/blob/main/results_2023-10-15T18-18-48.353426.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.13968120805369127,
"em_stderr": 0.00355008169467152,
"f1": 0.2125828439597308,
"f1_stderr": 0.0036624757731315858,
"acc": 0.4533641938925533,
"acc_stderr": 0.010674908726298674
},
"harness|drop|3": {
"em": 0.13968120805369127,
"em_stderr": 0.00355008169467152,
"f1": 0.2125828439597308,
"f1_stderr": 0.0036624757731315858
},
"harness|gsm8k|5": {
"acc": 0.1379833206974981,
"acc_stderr": 0.009499777327746841
},
"harness|winogrande|5": {
"acc": 0.7687450670876085,
"acc_stderr": 0.011850040124850508
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_stabilityai__StableBeluga-13B
|
[
"region:us"
] |
2023-08-17T23:20:00+00:00
|
{"pretty_name": "Evaluation run of stabilityai/StableBeluga-13B", "dataset_summary": "Dataset automatically created during the evaluation run of model [stabilityai/StableBeluga-13B](https://huggingface.co/stabilityai/StableBeluga-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_stabilityai__StableBeluga-13B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-15T18:18:48.353426](https://huggingface.co/datasets/open-llm-leaderboard/details_stabilityai__StableBeluga-13B/blob/main/results_2023-10-15T18-18-48.353426.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.13968120805369127,\n \"em_stderr\": 0.00355008169467152,\n \"f1\": 0.2125828439597308,\n \"f1_stderr\": 0.0036624757731315858,\n \"acc\": 0.4533641938925533,\n \"acc_stderr\": 0.010674908726298674\n },\n \"harness|drop|3\": {\n \"em\": 0.13968120805369127,\n \"em_stderr\": 0.00355008169467152,\n \"f1\": 0.2125828439597308,\n \"f1_stderr\": 0.0036624757731315858\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1379833206974981,\n \"acc_stderr\": 0.009499777327746841\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7687450670876085,\n \"acc_stderr\": 0.011850040124850508\n }\n}\n```", "repo_url": "https://huggingface.co/stabilityai/StableBeluga-13B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_31T13_19_00.568083", "path": ["**/details_harness|arc:challenge|25_2023-07-31T13:19:00.568083.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-31T13:19:00.568083.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_15T18_18_48.353426", "path": ["**/details_harness|drop|3_2023-10-15T18-18-48.353426.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-15T18-18-48.353426.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_15T18_18_48.353426", "path": ["**/details_harness|gsm8k|5_2023-10-15T18-18-48.353426.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-15T18-18-48.353426.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_31T13_19_00.568083", "path": ["**/details_harness|hellaswag|10_2023-07-31T13:19:00.568083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-31T13:19:00.568083.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", 
"data_files": [{"split": "2023_07_31T13_19_00.568083", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T13:19:00.568083.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-31T13:19:00.568083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T13:19:00.568083.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-31T13:19:00.568083.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-31T13:19:00.568083.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_31T13_19_00.568083", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T13:19:00.568083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T13:19:00.568083.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_31T13_19_00.568083", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-31T13:19:00.568083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-31T13:19:00.568083.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_31T13_19_00.568083", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-31T13:19:00.568083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-31T13:19:00.568083.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_31T13_19_00.568083", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T13:19:00.568083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T13:19:00.568083.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_31T13_19_00.568083", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T13:19:00.568083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T13:19:00.568083.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_31T13_19_00.568083", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-31T13:19:00.568083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-31T13:19:00.568083.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_31T13_19_00.568083", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T13:19:00.568083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T13:19:00.568083.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_31T13_19_00.568083", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T13:19:00.568083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T13:19:00.568083.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_31T13_19_00.568083", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T13:19:00.568083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T13:19:00.568083.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_31T13_19_00.568083", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T13:19:00.568083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T13:19:00.568083.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_31T13_19_00.568083", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-31T13:19:00.568083.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-31T13:19:00.568083.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_31T13_19_00.568083", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-31T13:19:00.568083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-31T13:19:00.568083.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_31T13_19_00.568083", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T13:19:00.568083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T13:19:00.568083.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_31T13_19_00.568083", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-31T13:19:00.568083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-31T13:19:00.568083.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_31T13_19_00.568083", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T13:19:00.568083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T13:19:00.568083.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_31T13_19_00.568083", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T13:19:00.568083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T13:19:00.568083.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_31T13_19_00.568083", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T13:19:00.568083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T13:19:00.568083.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_31T13_19_00.568083", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-31T13:19:00.568083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-31T13:19:00.568083.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_31T13_19_00.568083", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T13:19:00.568083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T13:19:00.568083.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_31T13_19_00.568083", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T13:19:00.568083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T13:19:00.568083.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_31T13_19_00.568083", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T13:19:00.568083.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T13:19:00.568083.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_31T13_19_00.568083", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T13:19:00.568083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T13:19:00.568083.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_31T13_19_00.568083", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T13:19:00.568083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T13:19:00.568083.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_31T13_19_00.568083", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T13:19:00.568083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T13:19:00.568083.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_31T13_19_00.568083", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T13:19:00.568083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T13:19:00.568083.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_31T13_19_00.568083", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T13:19:00.568083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T13:19:00.568083.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_31T13_19_00.568083", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T13:19:00.568083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T13:19:00.568083.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_31T13_19_00.568083", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T13:19:00.568083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T13:19:00.568083.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_31T13_19_00.568083", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T13:19:00.568083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T13:19:00.568083.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_31T13_19_00.568083", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T13:19:00.568083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T13:19:00.568083.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_31T13_19_00.568083", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T13:19:00.568083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T13:19:00.568083.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_31T13_19_00.568083", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T13:19:00.568083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T13:19:00.568083.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_31T13_19_00.568083", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-31T13:19:00.568083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-31T13:19:00.568083.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_31T13_19_00.568083", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T13:19:00.568083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T13:19:00.568083.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_31T13_19_00.568083", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-31T13:19:00.568083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-31T13:19:00.568083.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_31T13_19_00.568083", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T13:19:00.568083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T13:19:00.568083.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_31T13_19_00.568083", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T13:19:00.568083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T13:19:00.568083.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_31T13_19_00.568083", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T13:19:00.568083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T13:19:00.568083.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_31T13_19_00.568083", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-31T13:19:00.568083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-31T13:19:00.568083.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_31T13_19_00.568083", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-31T13:19:00.568083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-31T13:19:00.568083.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_31T13_19_00.568083", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T13:19:00.568083.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T13:19:00.568083.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_31T13_19_00.568083", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T13:19:00.568083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T13:19:00.568083.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_31T13_19_00.568083", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T13:19:00.568083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T13:19:00.568083.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_31T13_19_00.568083", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T13:19:00.568083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T13:19:00.568083.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_31T13_19_00.568083", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-31T13:19:00.568083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-31T13:19:00.568083.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_31T13_19_00.568083", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-31T13:19:00.568083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-31T13:19:00.568083.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_31T13_19_00.568083", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-31T13:19:00.568083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-31T13:19:00.568083.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_31T13_19_00.568083", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T13:19:00.568083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T13:19:00.568083.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_31T13_19_00.568083", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-31T13:19:00.568083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-31T13:19:00.568083.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_31T13_19_00.568083", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T13:19:00.568083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T13:19:00.568083.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_31T13_19_00.568083", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T13:19:00.568083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T13:19:00.568083.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_31T13_19_00.568083", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-31T13:19:00.568083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-31T13:19:00.568083.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_31T13_19_00.568083", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-31T13:19:00.568083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-31T13:19:00.568083.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_31T13_19_00.568083", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-31T13:19:00.568083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-31T13:19:00.568083.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_31T13_19_00.568083", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T13:19:00.568083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T13:19:00.568083.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_31T13_19_00.568083", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-31T13:19:00.568083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-31T13:19:00.568083.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_31T13_19_00.568083", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-31T13:19:00.568083.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-31T13:19:00.568083.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_31T13_19_00.568083", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-31T13:19:00.568083.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-31T13:19:00.568083.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_15T18_18_48.353426", "path": ["**/details_harness|winogrande|5_2023-10-15T18-18-48.353426.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-15T18-18-48.353426.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_31T13_19_00.568083", "path": ["results_2023-07-31T13:19:00.568083.parquet"]}, {"split": "2023_10_15T18_18_48.353426", "path": ["results_2023-10-15T18-18-48.353426.parquet"]}, {"split": "latest", "path": ["results_2023-10-15T18-18-48.353426.parquet"]}]}]}
|
2023-10-15T17:19:01+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of stabilityai/StableBeluga-13B
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model stabilityai/StableBeluga-13B on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
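The code snippet itself was stripped when this card was flattened; the following is a minimal sketch of the call, assuming the repository id open-llm-leaderboard/details_stabilityai__StableBeluga-13B (inferred from the naming pattern of the other evaluation-detail repositories in this document) and the harness_winogrande_5 configuration listed in this card's metadata:

```python
from datasets import load_dataset

# Per-sample details for one evaluated task; the "train" split always
# points to the latest run.
data = load_dataset("open-llm-leaderboard/details_stabilityai__StableBeluga-13B",
	"harness_winogrande_5",
	split="train")
```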
## Latest results
These are the latest results from run 2023-10-15T18:18:48.353426 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
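The results table was also stripped in flattening; the same numbers can be recovered from the "results" configuration, whose splits (including "latest") appear in this card's metadata. A minimal sketch, under the same repository-id assumption as above:

```python
from datasets import load_dataset

# "results" stores the aggregated metrics of each run; "latest" tracks
# the most recent results file (here the 2023-10-15 run).
results = load_dataset("open-llm-leaderboard/details_stabilityai__StableBeluga-13B",
	"results",
	split="latest")
print(results[0])
```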
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of stabilityai/StableBeluga-13B",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model stabilityai/StableBeluga-13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-15T18:18:48.353426(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of stabilityai/StableBeluga-13B",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model stabilityai/StableBeluga-13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-15T18:18:48.353426(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
19,
31,
167,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of stabilityai/StableBeluga-13B## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model stabilityai/StableBeluga-13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-15T18:18:48.353426(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
3541550438af6f7977c38b0e8ff6b130526364b0
|
# Dataset Card for Evaluation run of stabilityai/StableBeluga-7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/stabilityai/StableBeluga-7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [stabilityai/StableBeluga-7B](https://huggingface.co/stabilityai/StableBeluga-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_stabilityai__StableBeluga-7B",
"harness_winogrande_5",
split="train")
```
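Because each run is kept as its own timestamped split, an earlier run can also be pinned explicitly instead of relying on the moving "train"/"latest" split. A sketch using a split name taken from this card's metadata (the harness_gsm8k_5 configuration of the 2023-09-22 run):

```python
from datasets import load_dataset

# Pin one specific evaluation run via its timestamped split name.
gsm8k_run = load_dataset("open-llm-leaderboard/details_stabilityai__StableBeluga-7B",
	"harness_gsm8k_5",
	split="2023_09_22T22_12_45.590144")
```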
## Latest results
These are the [latest results from run 2023-09-22T22:12:45.590144](https://huggingface.co/datasets/open-llm-leaderboard/details_stabilityai__StableBeluga-7B/blob/main/results_2023-09-22T22-12-45.590144.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.08557046979865772,
"em_stderr": 0.0028646840549845006,
"f1": 0.15811556208053656,
"f1_stderr": 0.003126158993030364,
"acc": 0.4151299715828343,
"acc_stderr": 0.009762520250486784
},
"harness|drop|3": {
"em": 0.08557046979865772,
"em_stderr": 0.0028646840549845006,
"f1": 0.15811556208053656,
"f1_stderr": 0.003126158993030364
},
"harness|gsm8k|5": {
"acc": 0.07808946171341925,
"acc_stderr": 0.007390654481108218
},
"harness|winogrande|5": {
"acc": 0.7521704814522494,
"acc_stderr": 0.01213438601986535
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_stabilityai__StableBeluga-7B
|
[
"region:us"
] |
2023-08-17T23:20:09+00:00
|
{"pretty_name": "Evaluation run of stabilityai/StableBeluga-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [stabilityai/StableBeluga-7B](https://huggingface.co/stabilityai/StableBeluga-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_stabilityai__StableBeluga-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-22T22:12:45.590144](https://huggingface.co/datasets/open-llm-leaderboard/details_stabilityai__StableBeluga-7B/blob/main/results_2023-09-22T22-12-45.590144.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.08557046979865772,\n \"em_stderr\": 0.0028646840549845006,\n \"f1\": 0.15811556208053656,\n \"f1_stderr\": 0.003126158993030364,\n \"acc\": 0.4151299715828343,\n \"acc_stderr\": 0.009762520250486784\n },\n \"harness|drop|3\": {\n \"em\": 0.08557046979865772,\n \"em_stderr\": 0.0028646840549845006,\n \"f1\": 0.15811556208053656,\n \"f1_stderr\": 0.003126158993030364\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.07808946171341925,\n \"acc_stderr\": 0.007390654481108218\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7521704814522494,\n \"acc_stderr\": 0.01213438601986535\n }\n}\n```", "repo_url": "https://huggingface.co/stabilityai/StableBeluga-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_31T13_55_35.368331", "path": ["**/details_harness|arc:challenge|25_2023-07-31T13:55:35.368331.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-31T13:55:35.368331.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_22T22_12_45.590144", "path": ["**/details_harness|drop|3_2023-09-22T22-12-45.590144.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-22T22-12-45.590144.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_22T22_12_45.590144", "path": ["**/details_harness|gsm8k|5_2023-09-22T22-12-45.590144.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-22T22-12-45.590144.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_31T13_55_35.368331", "path": ["**/details_harness|hellaswag|10_2023-07-31T13:55:35.368331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-31T13:55:35.368331.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", 
"data_files": [{"split": "2023_07_31T13_55_35.368331", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T13:55:35.368331.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-31T13:55:35.368331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T13:55:35.368331.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-31T13:55:35.368331.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-31T13:55:35.368331.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_31T13_55_35.368331", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T13:55:35.368331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-31T13:55:35.368331.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_31T13_55_35.368331", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-31T13:55:35.368331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-31T13:55:35.368331.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_31T13_55_35.368331", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-31T13:55:35.368331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-31T13:55:35.368331.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_31T13_55_35.368331", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T13:55:35.368331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-31T13:55:35.368331.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_31T13_55_35.368331", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T13:55:35.368331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-31T13:55:35.368331.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_31T13_55_35.368331", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-31T13:55:35.368331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-31T13:55:35.368331.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_31T13_55_35.368331", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T13:55:35.368331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-31T13:55:35.368331.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_31T13_55_35.368331", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T13:55:35.368331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-31T13:55:35.368331.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_31T13_55_35.368331", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T13:55:35.368331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-31T13:55:35.368331.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_31T13_55_35.368331", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T13:55:35.368331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-31T13:55:35.368331.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_31T13_55_35.368331", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-31T13:55:35.368331.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-31T13:55:35.368331.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_31T13_55_35.368331", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-31T13:55:35.368331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-31T13:55:35.368331.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_31T13_55_35.368331", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T13:55:35.368331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-31T13:55:35.368331.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_31T13_55_35.368331", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-31T13:55:35.368331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-31T13:55:35.368331.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_31T13_55_35.368331", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T13:55:35.368331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-31T13:55:35.368331.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_31T13_55_35.368331", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T13:55:35.368331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-31T13:55:35.368331.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_31T13_55_35.368331", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T13:55:35.368331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-31T13:55:35.368331.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_31T13_55_35.368331", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-31T13:55:35.368331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-31T13:55:35.368331.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_31T13_55_35.368331", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T13:55:35.368331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-31T13:55:35.368331.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_31T13_55_35.368331", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T13:55:35.368331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-31T13:55:35.368331.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_31T13_55_35.368331", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T13:55:35.368331.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-31T13:55:35.368331.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_31T13_55_35.368331", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T13:55:35.368331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-31T13:55:35.368331.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_31T13_55_35.368331", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T13:55:35.368331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-31T13:55:35.368331.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_31T13_55_35.368331", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T13:55:35.368331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-31T13:55:35.368331.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_31T13_55_35.368331", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T13:55:35.368331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-31T13:55:35.368331.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_31T13_55_35.368331", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T13:55:35.368331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-31T13:55:35.368331.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_31T13_55_35.368331", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T13:55:35.368331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-31T13:55:35.368331.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_31T13_55_35.368331", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T13:55:35.368331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-31T13:55:35.368331.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_31T13_55_35.368331", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T13:55:35.368331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-31T13:55:35.368331.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_31T13_55_35.368331", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T13:55:35.368331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-31T13:55:35.368331.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_31T13_55_35.368331", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T13:55:35.368331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-31T13:55:35.368331.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_31T13_55_35.368331", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T13:55:35.368331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-31T13:55:35.368331.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_31T13_55_35.368331", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-31T13:55:35.368331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-31T13:55:35.368331.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_31T13_55_35.368331", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T13:55:35.368331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-31T13:55:35.368331.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_31T13_55_35.368331", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-31T13:55:35.368331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-31T13:55:35.368331.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_31T13_55_35.368331", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T13:55:35.368331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-31T13:55:35.368331.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_31T13_55_35.368331", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T13:55:35.368331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-31T13:55:35.368331.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_31T13_55_35.368331", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T13:55:35.368331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-31T13:55:35.368331.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_31T13_55_35.368331", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-31T13:55:35.368331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-31T13:55:35.368331.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_31T13_55_35.368331", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-31T13:55:35.368331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-31T13:55:35.368331.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_31T13_55_35.368331", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T13:55:35.368331.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-31T13:55:35.368331.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_31T13_55_35.368331", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T13:55:35.368331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-31T13:55:35.368331.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_31T13_55_35.368331", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T13:55:35.368331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-31T13:55:35.368331.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_31T13_55_35.368331", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T13:55:35.368331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-31T13:55:35.368331.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_31T13_55_35.368331", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-31T13:55:35.368331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-31T13:55:35.368331.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_31T13_55_35.368331", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-31T13:55:35.368331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-31T13:55:35.368331.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_31T13_55_35.368331", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-31T13:55:35.368331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-31T13:55:35.368331.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_31T13_55_35.368331", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T13:55:35.368331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-31T13:55:35.368331.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_31T13_55_35.368331", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-31T13:55:35.368331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-31T13:55:35.368331.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_31T13_55_35.368331", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T13:55:35.368331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-31T13:55:35.368331.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_31T13_55_35.368331", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T13:55:35.368331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-31T13:55:35.368331.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_31T13_55_35.368331", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-31T13:55:35.368331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-31T13:55:35.368331.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_31T13_55_35.368331", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-31T13:55:35.368331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-31T13:55:35.368331.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_31T13_55_35.368331", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-31T13:55:35.368331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-31T13:55:35.368331.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_31T13_55_35.368331", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T13:55:35.368331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-31T13:55:35.368331.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_31T13_55_35.368331", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-31T13:55:35.368331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-31T13:55:35.368331.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_31T13_55_35.368331", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-31T13:55:35.368331.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-31T13:55:35.368331.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_31T13_55_35.368331", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-31T13:55:35.368331.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-31T13:55:35.368331.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_22T22_12_45.590144", "path": ["**/details_harness|winogrande|5_2023-09-22T22-12-45.590144.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-22T22-12-45.590144.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_31T13_55_35.368331", "path": ["results_2023-07-31T13:55:35.368331.parquet"]}, {"split": "2023_09_22T22_12_45.590144", "path": ["results_2023-09-22T22-12-45.590144.parquet"]}, {"split": "latest", "path": ["results_2023-09-22T22-12-45.590144.parquet"]}]}]}
|
2023-09-22T21:12:57+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of stabilityai/StableBeluga-7B
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model stabilityai/StableBeluga-7B on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
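For example (a minimal sketch; the repository id below is an assumption based on the `details_<org>__<model>` naming convention used elsewhere in this document):

```python
from datasets import load_dataset

# Repository id assumed from the leaderboard's usual naming convention; adjust if it differs.
data = load_dataset("open-llm-leaderboard/details_stabilityai__StableBeluga-7B",
    "harness_winogrande_5",
    split="train")
```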
## Latest results
These are the latest results from run 2023-09-22T22:12:45.590144 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
84b22cc8fe24aca2b8f8ba68c5ef9e3bd77c79cc
|
# Dataset Card for Evaluation run of stabilityai/stablelm-tuned-alpha-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/stabilityai/stablelm-tuned-alpha-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [stabilityai/stablelm-tuned-alpha-7b](https://huggingface.co/stabilityai/stablelm-tuned-alpha-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_stabilityai__stablelm-tuned-alpha-7b",
"harness_winogrande_5",
split="train")
```
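Since each evaluated task is exposed as its own configuration, you can also enumerate all 64 of them (plus the aggregated "results" configuration); a short sketch using the `datasets` helper:

```python
from datasets import get_dataset_config_names

# Lists every per-task configuration, e.g. "harness_winogrande_5", plus "results".
configs = get_dataset_config_names("open-llm-leaderboard/details_stabilityai__stablelm-tuned-alpha-7b")
print(len(configs), configs[:5])
```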
## Latest results
These are the [latest results from run 2023-10-12T19:40:34.606567](https://huggingface.co/datasets/open-llm-leaderboard/details_stabilityai__stablelm-tuned-alpha-7b/blob/main/results_2023-10-12T19-40-34.606567.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0041946308724832215,
"em_stderr": 0.0006618716168266466,
"f1": 0.05621224832214779,
"f1_stderr": 0.0014117433231649174,
"acc": 0.2697578287825378,
"acc_stderr": 0.008265042433750026
},
"harness|drop|3": {
"em": 0.0041946308724832215,
"em_stderr": 0.0006618716168266466,
"f1": 0.05621224832214779,
"f1_stderr": 0.0014117433231649174
},
"harness|gsm8k|5": {
"acc": 0.008339651250947688,
"acc_stderr": 0.002504942226860537
},
"harness|winogrande|5": {
"acc": 0.5311760063141279,
"acc_stderr": 0.014025142640639513
}
}
```
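To retrieve these aggregated numbers programmatically rather than copying them from the card, load the "results" configuration with the "latest" split (a minimal sketch):

```python
from datasets import load_dataset

# "results" stores the aggregated metrics of every run; "latest" points to the newest run.
results = load_dataset("open-llm-leaderboard/details_stabilityai__stablelm-tuned-alpha-7b",
    "results",
    split="latest")
print(results[0])
```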
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_stabilityai__stablelm-tuned-alpha-7b
|
[
"region:us"
] |
2023-08-17T23:20:18+00:00
|
{"pretty_name": "Evaluation run of stabilityai/stablelm-tuned-alpha-7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [stabilityai/stablelm-tuned-alpha-7b](https://huggingface.co/stabilityai/stablelm-tuned-alpha-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_stabilityai__stablelm-tuned-alpha-7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-12T19:40:34.606567](https://huggingface.co/datasets/open-llm-leaderboard/details_stabilityai__stablelm-tuned-alpha-7b/blob/main/results_2023-10-12T19-40-34.606567.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0041946308724832215,\n \"em_stderr\": 0.0006618716168266466,\n \"f1\": 0.05621224832214779,\n \"f1_stderr\": 0.0014117433231649174,\n \"acc\": 0.2697578287825378,\n \"acc_stderr\": 0.008265042433750026\n },\n \"harness|drop|3\": {\n \"em\": 0.0041946308724832215,\n \"em_stderr\": 0.0006618716168266466,\n \"f1\": 0.05621224832214779,\n \"f1_stderr\": 0.0014117433231649174\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.008339651250947688,\n \"acc_stderr\": 0.002504942226860537\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5311760063141279,\n \"acc_stderr\": 0.014025142640639513\n }\n}\n```", "repo_url": "https://huggingface.co/stabilityai/stablelm-tuned-alpha-7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_19T17_04_40.596532", "path": ["**/details_harness|arc:challenge|25_2023-07-19T17:04:40.596532.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-19T17:04:40.596532.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_12T19_40_34.606567", "path": ["**/details_harness|drop|3_2023-10-12T19-40-34.606567.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-12T19-40-34.606567.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_12T19_40_34.606567", "path": ["**/details_harness|gsm8k|5_2023-10-12T19-40-34.606567.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-12T19-40-34.606567.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_19T17_04_40.596532", "path": ["**/details_harness|hellaswag|10_2023-07-19T17:04:40.596532.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-07-19T17:04:40.596532.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_19T17_04_40.596532", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:04:40.596532.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:04:40.596532.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:04:40.596532.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-19T17:04:40.596532.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:04:40.596532.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_07_19T17_04_40.596532", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:04:40.596532.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:04:40.596532.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_19T17_04_40.596532", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:04:40.596532.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:04:40.596532.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_19T17_04_40.596532", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:04:40.596532.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:04:40.596532.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_19T17_04_40.596532", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:04:40.596532.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:04:40.596532.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_19T17_04_40.596532", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:04:40.596532.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:04:40.596532.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_19T17_04_40.596532", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:04:40.596532.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:04:40.596532.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_19T17_04_40.596532", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:04:40.596532.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:04:40.596532.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_19T17_04_40.596532", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:04:40.596532.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:04:40.596532.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_19T17_04_40.596532", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:04:40.596532.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:04:40.596532.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_19T17_04_40.596532", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:04:40.596532.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:04:40.596532.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_07_19T17_04_40.596532", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:04:40.596532.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:04:40.596532.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_19T17_04_40.596532", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:04:40.596532.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:04:40.596532.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_19T17_04_40.596532", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:04:40.596532.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:04:40.596532.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_19T17_04_40.596532", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:04:40.596532.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:04:40.596532.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_19T17_04_40.596532", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:04:40.596532.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:04:40.596532.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_19T17_04_40.596532", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:04:40.596532.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:04:40.596532.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_19T17_04_40.596532", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:04:40.596532.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:04:40.596532.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_19T17_04_40.596532", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:04:40.596532.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:04:40.596532.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_19T17_04_40.596532", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:04:40.596532.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:04:40.596532.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_19T17_04_40.596532", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:04:40.596532.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:04:40.596532.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_19T17_04_40.596532", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:04:40.596532.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:04:40.596532.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_19T17_04_40.596532", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:04:40.596532.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:04:40.596532.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_19T17_04_40.596532", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:04:40.596532.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:04:40.596532.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_19T17_04_40.596532", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:04:40.596532.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:04:40.596532.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_19T17_04_40.596532", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:04:40.596532.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:04:40.596532.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_19T17_04_40.596532", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:04:40.596532.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:04:40.596532.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_19T17_04_40.596532", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:04:40.596532.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:04:40.596532.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_19T17_04_40.596532", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:04:40.596532.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:04:40.596532.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_19T17_04_40.596532", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:04:40.596532.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:04:40.596532.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_19T17_04_40.596532", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:04:40.596532.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:04:40.596532.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_19T17_04_40.596532", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:04:40.596532.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:04:40.596532.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_19T17_04_40.596532", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:04:40.596532.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:04:40.596532.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_19T17_04_40.596532", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:04:40.596532.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:04:40.596532.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_19T17_04_40.596532", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:04:40.596532.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:04:40.596532.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_19T17_04_40.596532", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:04:40.596532.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:04:40.596532.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_19T17_04_40.596532", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:04:40.596532.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:04:40.596532.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_19T17_04_40.596532", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:04:40.596532.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:04:40.596532.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_19T17_04_40.596532", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:04:40.596532.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:04:40.596532.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_19T17_04_40.596532", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T17:04:40.596532.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-19T17:04:40.596532.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_19T17_04_40.596532", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:04:40.596532.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:04:40.596532.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_19T17_04_40.596532", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:04:40.596532.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:04:40.596532.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_19T17_04_40.596532", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:04:40.596532.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:04:40.596532.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_19T17_04_40.596532", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:04:40.596532.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:04:40.596532.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_19T17_04_40.596532", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:04:40.596532.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:04:40.596532.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_19T17_04_40.596532", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:04:40.596532.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:04:40.596532.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_19T17_04_40.596532", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:04:40.596532.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:04:40.596532.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_19T17_04_40.596532", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:04:40.596532.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:04:40.596532.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_19T17_04_40.596532", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:04:40.596532.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:04:40.596532.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_19T17_04_40.596532", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:04:40.596532.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:04:40.596532.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_19T17_04_40.596532", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:04:40.596532.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:04:40.596532.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_19T17_04_40.596532", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:04:40.596532.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:04:40.596532.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_07_19T17_04_40.596532", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:04:40.596532.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:04:40.596532.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_19T17_04_40.596532", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:04:40.596532.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:04:40.596532.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_19T17_04_40.596532", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:04:40.596532.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:04:40.596532.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_19T17_04_40.596532", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:04:40.596532.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:04:40.596532.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_19T17_04_40.596532", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T17:04:40.596532.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-19T17:04:40.596532.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_19T17_04_40.596532", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:04:40.596532.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:04:40.596532.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_19T17_04_40.596532", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T17:04:40.596532.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-19T17:04:40.596532.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_12T19_40_34.606567", "path": ["**/details_harness|winogrande|5_2023-10-12T19-40-34.606567.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-12T19-40-34.606567.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_19T17_04_40.596532", "path": ["results_2023-07-19T17:04:40.596532.parquet"]}, {"split": "2023_10_12T19_40_34.606567", "path": ["results_2023-10-12T19-40-34.606567.parquet"]}, {"split": "latest", "path": ["results_2023-10-12T19-40-34.606567.parquet"]}]}]}
|
2023-10-12T18:40:47+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of stabilityai/stablelm-tuned-alpha-7b
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model stabilityai/stablelm-tuned-alpha-7b on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
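For example (the same call shown in the card body above):

```python
from datasets import load_dataset

data = load_dataset("open-llm-leaderboard/details_stabilityai__stablelm-tuned-alpha-7b",
    "harness_winogrande_5",
    split="train")
```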
## Latest results
These are the latest results from run 2023-10-12T19:40:34.606567 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of stabilityai/stablelm-tuned-alpha-7b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model stabilityai/stablelm-tuned-alpha-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-12T19:40:34.606567(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of stabilityai/stablelm-tuned-alpha-7b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model stabilityai/stablelm-tuned-alpha-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-12T19:40:34.606567(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
24,
31,
172,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of stabilityai/stablelm-tuned-alpha-7b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model stabilityai/stablelm-tuned-alpha-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-12T19:40:34.606567(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
4489de023f8da86065ae2087fc596ba410223f63
|
# Dataset Card for Evaluation run of Locutusque/gpt2-conversational-or-qa
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Locutusque/gpt2-conversational-or-qa
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [Locutusque/gpt2-conversational-or-qa](https://huggingface.co/Locutusque/gpt2-conversational-or-qa) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
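# "harness_winogrande_5" is one per-task config; the "train" split tracks the latest run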
data = load_dataset("open-llm-leaderboard/details_Locutusque__gpt2-conversational-or-qa",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-17T06:39:40.166876](https://huggingface.co/datasets/open-llm-leaderboard/details_Locutusque__gpt2-conversational-or-qa/blob/main/results_2023-09-17T06-39-40.166876.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.00041946308724832214,
"em_stderr": 0.00020969854707829385,
"f1": 0.015460360738255055,
"f1_stderr": 0.0006333702020804492,
"acc": 0.25610125343097334,
"acc_stderr": 0.007403477156790923
},
"harness|drop|3": {
"em": 0.00041946308724832214,
"em_stderr": 0.00020969854707829385,
"f1": 0.015460360738255055,
"f1_stderr": 0.0006333702020804492
},
"harness|gsm8k|5": {
"acc": 0.000758150113722517,
"acc_stderr": 0.0007581501137225174
},
"harness|winogrande|5": {
"acc": 0.5114443567482242,
"acc_stderr": 0.014048804199859329
}
}
```
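The aggregated metrics above can also be pulled from the extra "results" configuration; a minimal sketch, assuming the "latest" split name listed in this dataset's configs:

```python
from datasets import load_dataset

# "results" stores the aggregated metrics across tasks; "latest" is the newest run
results = load_dataset(
    "open-llm-leaderboard/details_Locutusque__gpt2-conversational-or-qa",
    "results",
    split="latest",
)
print(results.column_names)  # inspect the aggregated fields before relying on them
```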
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_Locutusque__gpt2-conversational-or-qa
|
[
"region:us"
] |
2023-08-17T23:20:27+00:00
|
{"pretty_name": "Evaluation run of Locutusque/gpt2-conversational-or-qa", "dataset_summary": "Dataset automatically created during the evaluation run of model [Locutusque/gpt2-conversational-or-qa](https://huggingface.co/Locutusque/gpt2-conversational-or-qa) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Locutusque__gpt2-conversational-or-qa\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-17T06:39:40.166876](https://huggingface.co/datasets/open-llm-leaderboard/details_Locutusque__gpt2-conversational-or-qa/blob/main/results_2023-09-17T06-39-40.166876.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.00041946308724832214,\n \"em_stderr\": 0.00020969854707829385,\n \"f1\": 0.015460360738255055,\n \"f1_stderr\": 0.0006333702020804492,\n \"acc\": 0.25610125343097334,\n \"acc_stderr\": 0.007403477156790923\n },\n \"harness|drop|3\": {\n \"em\": 0.00041946308724832214,\n \"em_stderr\": 0.00020969854707829385,\n \"f1\": 0.015460360738255055,\n \"f1_stderr\": 0.0006333702020804492\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.000758150113722517,\n \"acc_stderr\": 0.0007581501137225174\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5114443567482242,\n \"acc_stderr\": 0.014048804199859329\n }\n}\n```", "repo_url": "https://huggingface.co/Locutusque/gpt2-conversational-or-qa", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_07_18T16_08_01.149355", "path": ["**/details_harness|arc:challenge|25_2023-07-18T16:08:01.149355.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-07-18T16:08:01.149355.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_17T06_39_40.166876", "path": ["**/details_harness|drop|3_2023-09-17T06-39-40.166876.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-17T06-39-40.166876.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_17T06_39_40.166876", "path": ["**/details_harness|gsm8k|5_2023-09-17T06-39-40.166876.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-17T06-39-40.166876.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_07_18T16_08_01.149355", "path": ["**/details_harness|hellaswag|10_2023-07-18T16:08:01.149355.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hellaswag|10_2023-07-18T16:08:01.149355.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_07_18T16_08_01.149355", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T16:08:01.149355.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-07-18T16:08:01.149355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-07-18T16:08:01.149355.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-management|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T16:08:01.149355.parquet", "**/details_harness|hendrycksTest-virology|5_2023-07-18T16:08:01.149355.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-07-18T16:08:01.149355.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_07_18T16_08_01.149355", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T16:08:01.149355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-18T16:08:01.149355.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_07_18T16_08_01.149355", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-18T16:08:01.149355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-07-18T16:08:01.149355.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_07_18T16_08_01.149355", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-18T16:08:01.149355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-07-18T16:08:01.149355.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_07_18T16_08_01.149355", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T16:08:01.149355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-07-18T16:08:01.149355.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_07_18T16_08_01.149355", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T16:08:01.149355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-18T16:08:01.149355.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_07_18T16_08_01.149355", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-18T16:08:01.149355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-07-18T16:08:01.149355.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_07_18T16_08_01.149355", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T16:08:01.149355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-07-18T16:08:01.149355.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_07_18T16_08_01.149355", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T16:08:01.149355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-07-18T16:08:01.149355.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_07_18T16_08_01.149355", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T16:08:01.149355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-07-18T16:08:01.149355.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_07_18T16_08_01.149355", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T16:08:01.149355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-07-18T16:08:01.149355.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_07_18T16_08_01.149355", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-18T16:08:01.149355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-07-18T16:08:01.149355.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_07_18T16_08_01.149355", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-18T16:08:01.149355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-07-18T16:08:01.149355.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_07_18T16_08_01.149355", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T16:08:01.149355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-18T16:08:01.149355.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_07_18T16_08_01.149355", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-18T16:08:01.149355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-07-18T16:08:01.149355.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_07_18T16_08_01.149355", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T16:08:01.149355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-18T16:08:01.149355.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_07_18T16_08_01.149355", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T16:08:01.149355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-18T16:08:01.149355.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_07_18T16_08_01.149355", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T16:08:01.149355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-07-18T16:08:01.149355.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_07_18T16_08_01.149355", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-18T16:08:01.149355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-07-18T16:08:01.149355.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_07_18T16_08_01.149355", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T16:08:01.149355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-07-18T16:08:01.149355.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_07_18T16_08_01.149355", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T16:08:01.149355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-18T16:08:01.149355.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_07_18T16_08_01.149355", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T16:08:01.149355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-18T16:08:01.149355.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_07_18T16_08_01.149355", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T16:08:01.149355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-18T16:08:01.149355.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_07_18T16_08_01.149355", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T16:08:01.149355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-07-18T16:08:01.149355.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_07_18T16_08_01.149355", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T16:08:01.149355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-18T16:08:01.149355.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_07_18T16_08_01.149355", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T16:08:01.149355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-18T16:08:01.149355.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_07_18T16_08_01.149355", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T16:08:01.149355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-18T16:08:01.149355.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_07_18T16_08_01.149355", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T16:08:01.149355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-18T16:08:01.149355.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_07_18T16_08_01.149355", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T16:08:01.149355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-07-18T16:08:01.149355.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_07_18T16_08_01.149355", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T16:08:01.149355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-18T16:08:01.149355.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_07_18T16_08_01.149355", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T16:08:01.149355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-18T16:08:01.149355.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_07_18T16_08_01.149355", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T16:08:01.149355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-18T16:08:01.149355.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_07_18T16_08_01.149355", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T16:08:01.149355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-18T16:08:01.149355.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_07_18T16_08_01.149355", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-18T16:08:01.149355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-07-18T16:08:01.149355.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_07_18T16_08_01.149355", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T16:08:01.149355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-07-18T16:08:01.149355.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_07_18T16_08_01.149355", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-18T16:08:01.149355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-07-18T16:08:01.149355.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_07_18T16_08_01.149355", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T16:08:01.149355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-07-18T16:08:01.149355.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_07_18T16_08_01.149355", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T16:08:01.149355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-18T16:08:01.149355.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_07_18T16_08_01.149355", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T16:08:01.149355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-07-18T16:08:01.149355.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_07_18T16_08_01.149355", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-18T16:08:01.149355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-07-18T16:08:01.149355.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_07_18T16_08_01.149355", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-18T16:08:01.149355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-07-18T16:08:01.149355.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_07_18T16_08_01.149355", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T16:08:01.149355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-07-18T16:08:01.149355.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_07_18T16_08_01.149355", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T16:08:01.149355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-07-18T16:08:01.149355.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_07_18T16_08_01.149355", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T16:08:01.149355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-07-18T16:08:01.149355.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_07_18T16_08_01.149355", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T16:08:01.149355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-18T16:08:01.149355.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_07_18T16_08_01.149355", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-18T16:08:01.149355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-07-18T16:08:01.149355.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_07_18T16_08_01.149355", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-18T16:08:01.149355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-07-18T16:08:01.149355.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_07_18T16_08_01.149355", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-18T16:08:01.149355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-07-18T16:08:01.149355.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_07_18T16_08_01.149355", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T16:08:01.149355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-07-18T16:08:01.149355.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_07_18T16_08_01.149355", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-18T16:08:01.149355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-07-18T16:08:01.149355.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_07_18T16_08_01.149355", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T16:08:01.149355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-07-18T16:08:01.149355.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_07_18T16_08_01.149355", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T16:08:01.149355.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-07-18T16:08:01.149355.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_07_18T16_08_01.149355", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-18T16:08:01.149355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-07-18T16:08:01.149355.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_07_18T16_08_01.149355", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-18T16:08:01.149355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-07-18T16:08:01.149355.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_07_18T16_08_01.149355", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-18T16:08:01.149355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-07-18T16:08:01.149355.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_07_18T16_08_01.149355", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T16:08:01.149355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-18T16:08:01.149355.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_07_18T16_08_01.149355", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-18T16:08:01.149355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-07-18T16:08:01.149355.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_07_18T16_08_01.149355", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-18T16:08:01.149355.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-07-18T16:08:01.149355.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_07_18T16_08_01.149355", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-18T16:08:01.149355.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-07-18T16:08:01.149355.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_17T06_39_40.166876", "path": ["**/details_harness|winogrande|5_2023-09-17T06-39-40.166876.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-17T06-39-40.166876.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_07_18T16_08_01.149355", "path": ["results_2023-07-18T16:08:01.149355.parquet"]}, {"split": "2023_09_17T06_39_40.166876", "path": ["results_2023-09-17T06-39-40.166876.parquet"]}, {"split": "latest", "path": ["results_2023-09-17T06-39-40.166876.parquet"]}]}]}
|
2023-09-17T05:39:50+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of Locutusque/gpt2-conversational-or-qa
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model Locutusque/gpt2-conversational-or-qa on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-09-17T06:39:40.166876 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of Locutusque/gpt2-conversational-or-qa",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Locutusque/gpt2-conversational-or-qa on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-17T06:39:40.166876(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Locutusque/gpt2-conversational-or-qa",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Locutusque/gpt2-conversational-or-qa on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-17T06:39:40.166876(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
23,
31,
171,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Locutusque/gpt2-conversational-or-qa## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Locutusque/gpt2-conversational-or-qa on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-17T06:39:40.166876(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
5cf62a8e5185913e54db432c9d087f7408aaf6df
|
# Dataset of kijin_seija/鬼人正邪/키진세이쟈 (Touhou)
This is the dataset of kijin_seija/鬼人正邪/키진세이쟈 (Touhou), containing 500 images and their tags.
The core tags of this character are `black_hair, streaked_hair, horns, multicolored_hair, red_hair, white_hair, red_eyes, short_hair, bow, hair_between_eyes, bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 649.76 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kijin_seija_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800              | 500      | 357.21 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kijin_seija_touhou/resolve/main/dataset-800.zip)                | IMG+TXT    | Dataset with the shorter side not exceeding 800 pixels.               |
| stage3-p480-800 | 1182 | 754.68 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kijin_seija_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200             | 500      | 566.38 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kijin_seija_touhou/resolve/main/dataset-1200.zip)               | IMG+TXT    | Dataset with the shorter side not exceeding 1200 pixels.              |
| stage3-p480-1200 | 1182 | 1.06 GiB | [Download](https://huggingface.co/datasets/CyberHarem/kijin_seija_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/kijin_seija_touhou',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
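Since the core tags listed above are pruned from the captions, a training pipeline may want to restore them. A minimal sketch, assuming you extracted one of the IMG+TXT packages into `dataset_dir` and that it contains one `.txt` caption file per image:

```python
import pathlib

# core tags of this character, as listed at the top of this card
CORE_TAGS = ("black_hair, streaked_hair, horns, multicolored_hair, red_hair, "
             "white_hair, red_eyes, short_hair, bow, hair_between_eyes, bangs")

# prepend the pruned core tags to every caption file in the extracted package
for txt in pathlib.Path('dataset_dir').rglob('*.txt'):
    caption = txt.read_text(encoding='utf-8').strip()
    txt.write_text(f"{CORE_TAGS}, {caption}", encoding='utf-8')
```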
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 21 |  |  |  |  |  | 1girl, looking_at_viewer, puffy_short_sleeves, solo, white_dress, bracelet, nail_polish, arrow_(symbol), white_background, blue_bowtie, sash, simple_background, black_nails, grin, upside-down |
| 1 | 7 |  |  |  |  |  | 1girl, bracelet, dress, short_sleeves, solo, tongue_out, arrow_(symbol), looking_at_viewer, smile, puffy_sleeves, sandals, upside-down |
| 2 | 6 |  |  |  |  |  | 1girl, dress, short_sleeves, solo, tongue_out, looking_at_viewer |
| 3 | 5 |  |  |  |  |  | 1girl, dress, looking_at_viewer, short_sleeves, solo, smile, open_mouth, upside-down |
| 4 | 5 |  |  |  |  |  | 1girl, dress, grin, solo, arrow_(symbol), sharp_teeth, bracelet, sandals |
| 5 | 6 |  |  |  |  |  | 1girl, simple_background, solo, white_background, looking_at_viewer, smile, upper_body |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | puffy_short_sleeves | solo | white_dress | bracelet | nail_polish | arrow_(symbol) | white_background | blue_bowtie | sash | simple_background | black_nails | grin | upside-down | dress | short_sleeves | tongue_out | smile | puffy_sleeves | sandals | open_mouth | sharp_teeth | upper_body |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:----------------------|:-------|:--------------|:-----------|:--------------|:-----------------|:-------------------|:--------------|:-------|:--------------------|:--------------|:-------|:--------------|:--------|:----------------|:-------------|:--------|:----------------|:----------|:-------------|:--------------|:-------------|
| 0 | 21 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | X | | X | | X | | X | | | | | | | X | X | X | X | X | X | X | | | |
| 2 | 6 |  |  |  |  |  | X | X | | X | | | | | | | | | | | | X | X | X | | | | | | |
| 3 | 5 |  |  |  |  |  | X | X | | X | | | | | | | | | | | X | X | X | | X | | | X | | |
| 4 | 5 |  |  |  |  |  | X | | | X | | X | | X | | | | | | X | | X | | | | | X | | X | |
| 5 | 6 |  |  |  |  |  | X | X | | X | | | | | X | | | X | | | | | | | X | | | | | X |
|
CyberHarem/kijin_seija_touhou
|
[
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] |
2023-08-17T23:21:25+00:00
|
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
|
2024-01-14T18:18:18+00:00
|
[] |
[] |
TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
|
Dataset of kijin\_seija/鬼人正邪/키진세이쟈 (Touhou)
===========================================
This is the dataset of kijin\_seija/鬼人正邪/키진세이쟈 (Touhou), containing 500 images and their tags.
The core tags of this character are 'black\_hair, streaked\_hair, horns, multicolored\_hair, red\_hair, white\_hair, red\_eyes, short\_hair, bow, hair\_between\_eyes, bangs', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
|
[
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
44,
61,
5,
4
] |
[
"passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version"
] |
12ddcccf3ae5e5ad72c896d1c1765aafde8a5b30
|
# Dataset of houraisan_kaguya/蓬莱山輝夜/호라이산카구야 (Touhou)
This is the dataset of houraisan_kaguya/蓬莱山輝夜/호라이산카구야 (Touhou), containing 500 images and their tags.
The core tags of this character are `long_hair, black_hair, very_long_hair, bow, red_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 631.89 MiB | [Download](https://huggingface.co/datasets/CyberHarem/houraisan_kaguya_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 410.58 MiB | [Download](https://huggingface.co/datasets/CyberHarem/houraisan_kaguya_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1026 | 722.22 MiB | [Download](https://huggingface.co/datasets/CyberHarem/houraisan_kaguya_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 576.52 MiB | [Download](https://huggingface.co/datasets/CyberHarem/houraisan_kaguya_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1026 | 939.54 MiB | [Download](https://huggingface.co/datasets/CyberHarem/houraisan_kaguya_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/houraisan_kaguya_touhou',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 9 |  |  |  |  |  | 1girl, hime_cut, long_sleeves, pink_shirt, red_skirt, solo, wide_sleeves, blunt_bangs, full_moon, looking_at_viewer, white_bowtie, closed_mouth, frilled_shirt_collar, smile, jeweled_branch_of_hourai, night_sky, sleeves_past_wrists, bamboo, frilled_sleeves, holding |
| 1 | 9 |  |  |  |  |  | 1girl, full_moon, long_sleeves, shirt, skirt, solo, wide_sleeves, looking_at_viewer, smile, jeweled_branch_of_hourai, night_sky, starry_sky, bamboo |
| 2 | 10 |  |  |  |  |  | 1girl, jeweled_branch_of_hourai, solo, full_moon, japanese_clothes, skirt, wide_sleeves, smile |
| 3 | 7 |  |  |  |  |  | 1girl, jeweled_branch_of_hourai, solo, wide_sleeves, smile, japanese_clothes, simple_background, white_background |
| 4 | 6 |  |  |  |  |  | 1girl, long_sleeves, looking_at_viewer, skirt, smile, solo, wide_sleeves, full_moon, shirt |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | hime_cut | long_sleeves | pink_shirt | red_skirt | solo | wide_sleeves | blunt_bangs | full_moon | looking_at_viewer | white_bowtie | closed_mouth | frilled_shirt_collar | smile | jeweled_branch_of_hourai | night_sky | sleeves_past_wrists | bamboo | frilled_sleeves | holding | shirt | skirt | starry_sky | japanese_clothes | simple_background | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------|:---------------|:-------------|:------------|:-------|:---------------|:--------------|:------------|:--------------------|:---------------|:---------------|:-----------------------|:--------|:---------------------------|:------------|:----------------------|:---------|:------------------|:----------|:--------|:--------|:-------------|:-------------------|:--------------------|:-------------------|
| 0 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | |
| 1 | 9 |  |  |  |  |  | X | | X | | | X | X | | X | X | | | | X | X | X | | X | | | X | X | X | | | |
| 2 | 10 |  |  |  |  |  | X | | | | | X | X | | X | | | | | X | X | | | | | | | X | | X | | |
| 3 | 7 |  |  |  |  |  | X | | | | | X | X | | | | | | | X | X | | | | | | | | | X | X | X |
| 4 | 6 |  |  |  |  |  | X | | X | | | X | X | | X | X | | | | X | | | | | | | X | X | | | | |
|
CyberHarem/houraisan_kaguya_touhou
|
[
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] |
2023-08-17T23:21:33+00:00
|
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
|
2024-01-14T14:06:25+00:00
|
[] |
[] |
TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
|
Dataset of houraisan\_kaguya/蓬莱山輝夜/호라이산카구야 (Touhou)
===================================================
This is the dataset of houraisan\_kaguya/蓬莱山輝夜/호라이산카구야 (Touhou), containing 500 images and their tags.
The core tags of this character are 'long\_hair, black\_hair, very\_long\_hair, bow, red\_eyes', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
|
[
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
44,
61,
5,
4
] |
[
"passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version"
] |
b4a4669219eb0e5b6027a171c5a17d0ecf6a8c25
|
# Dataset Card for "fin_phrasebank_fix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
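Although the card body is still a stub, the repository metadata below declares `train` and `test` splits with `label` (int64) and `text` (string) features, so a minimal loading sketch with the `datasets` library would be:
```python
from datasets import load_dataset

# loads both splits declared in the repository's dataset config
ds = load_dataset("judy93536/fin_phrasebank_fix")

print(ds)              # DatasetDict with 'train' (1924 rows) and 'test' (339 rows)
print(ds["train"][0])  # each example has an int64 'label' and a string 'text'
```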
|
judy93536/fin_phrasebank_fix
|
[
"region:us"
] |
2023-08-17T23:36:33+00:00
|
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "label", "dtype": "int64"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 256094, "num_examples": 1924}, {"name": "test", "num_bytes": 47170, "num_examples": 339}], "download_size": 171135, "dataset_size": 303264}}
|
2023-08-17T23:36:37+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "fin_phrasebank_fix"
More Information needed
|
[
"# Dataset Card for \"fin_phrasebank_fix\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"fin_phrasebank_fix\"\n\nMore Information needed"
] |
[
6,
17
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"fin_phrasebank_fix\"\n\nMore Information needed"
] |
49ef032b56b4d167a3ea4a08a3d4fbe8a05a017f
|
Classic model (accepts 40% of captchas, with 85% accuracy on 8-character codes):
https://github.com/Hecate2/Ignareo
Modern model (accepts 100% of captchas, with 91% accuracy on 8-character codes):
https://huggingface.co/AIris-Channel/vit-gpt2-verifycode-caption
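As a rough sketch of how the modern model might be applied, assuming it works with the standard `transformers` image-to-text pipeline (`captcha.png` is a hypothetical local file):
```python
from transformers import pipeline

# caption-style captcha solver; checkpoint name taken from the link above
solver = pipeline("image-to-text", model="AIris-Channel/vit-gpt2-verifycode-caption")

# 'captcha.png' is a hypothetical local captcha image
print(solver("captcha.png"))
```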
|
Hecate2/ISML-captcha
|
[
"region:us"
] |
2023-08-17T23:57:00+00:00
|
{}
|
2023-08-22T00:43:34+00:00
|
[] |
[] |
TAGS
#region-us
|
Classic model: (accepts 40% with 8-char accuracy 85%)
URL
Modern model: (accepts 100% with 8-char accuracy 91%)
URL
|
[] |
[
"TAGS\n#region-us \n"
] |
[
6
] |
[
"passage: TAGS\n#region-us \n"
] |
463f37e12be915a0319504777bdf0bb4d02b83e1
|
# Dataset Card for ShareGPT~~52K~~90K
## Dataset Description
- **Homepage: N/A**
- **Repository: N/A**
- **Paper: N/A**
- **Leaderboard: N/A**
- **Point of Contact:** Ronsor Labs <[email protected]>
### Dataset Summary
This dataset is a collection of approximately ~~52,000~~**90,000** conversations scraped via the ShareGPT API before it was shut down.
These conversations include both user prompts and responses from OpenAI's ChatGPT.
This repository now contains the new 90K conversations version. The previous 52K may
be found in the `old/` directory.
### Supported Tasks and Leaderboards
* text-generation
### Languages
This dataset is expected to primarily consist of messages in English and other Western languages.
## Dataset Structure
### Data Instances
```json
{
"id": "2wtLdZm",
"conversations": [
{
"from": "human",
"value": "Are you familiar with the SAS programming language?"
},
{
"from": "gpt",
"value": "<div class=\"markdown prose w-full break-words dark:prose-invert light\"><p>Yes, I am familiar with the SAS programming language. SAS is a statistical software suite that is widely used in data analysis, data management, and business intelligence applications. It has its own programming language, which is used to manipulate and analyze data, generate reports, and automate data-driven workflows. The language is known for its ease of use and powerful capabilities, and it is commonly used in industries such as healthcare, finance, and government.</p></div>"
},
{
"from": "human",
"value": "Can you write a SAS proc that imports a dataset from a CSV saved on my local machine?"
}
]
}
```
### Data Fields
* **id**: conversation id
* **conversations**: conversation session array
* **from**: ChatGPT ("gpt") or the user ("human")
* **value**: message contents as raw HTML
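To make the schema concrete, here is a minimal sketch of walking one record shaped like the example above (the sample text is abbreviated):
```python
# a single record shaped like the "Data Instances" example above
record = {
    "id": "2wtLdZm",
    "conversations": [
        {"from": "human", "value": "Are you familiar with the SAS programming language?"},
        {"from": "gpt", "value": "<div><p>Yes, I am familiar with the SAS programming language.</p></div>"},
    ],
}

# 'from' is "gpt" or "human"; 'value' holds the raw (possibly HTML) message
for turn in record["conversations"]:
    speaker = "User" if turn["from"] == "human" else "ChatGPT"
    print(f"{speaker}: {turn['value']}")
```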
### Data Splits
N/A
## Dataset Creation
### Curation Rationale
This is a decently large dataset of realistic human-AI conversations that I believe should be released
to the research community.
### Source Data
#### Initial Data Collection and Normalization
This data was collected using the ShareGPT API.
#### Who are the source language producers?
ShareGPT users and OpenAI ChatGPT.
### Annotations
#### Annotation process
N/A
#### Who are the annotators?
N/A
### Personal and Sensitive Information
This dataset *may* contain personal information if ShareGPT users sent such information to
ChatGPT. However, ChatGPT warns users not to submit personal information, so without further
evaluation, we believe that this dataset contains little or no personal information.
## Considerations for Using the Data
### Social Impact of Dataset
This dataset may be used to train models that are competitive with OpenAI's ChatGPT. Please filter
this dataset first, as it may contain canned responses, raw HTML, and other undesirable information.
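One straightforward first pass, sketched below using only the Python standard library, is to strip the HTML markup from each `value` field; filtering out canned responses is left to the reader:
```python
from html.parser import HTMLParser

class _TextExtractor(HTMLParser):
    """Collect only the text content of a raw HTML message."""
    def __init__(self):
        super().__init__()
        self._parts = []

    def handle_data(self, data):
        self._parts.append(data)

def strip_html(value: str) -> str:
    parser = _TextExtractor()
    parser.feed(value)
    return ''.join(parser._parts)

# strip_html('<div class="markdown"><p>Yes, I am familiar with SAS.</p></div>')
# -> 'Yes, I am familiar with SAS.'
```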
### Discussion of Biases
This dataset exhibits all the biases of OpenAI's ChatGPT models (GPT-3.5 and GPT-4) as well as the
biases of the users who uploaded the conversations.
### Other Known Limitations
N/A
## Additional Information
### Dataset Curators
None.
### Licensing Information
**CC0: No Rights Reserved.**
The output of machine learning algorithms is uncopyrightable in the United States and other jurisdictions.
**Additionally, the OpenAI terms of service do not apply to this dataset as users of this dataset
are not accessing the OpenAI service.**
### Citation Information
TODO
### Contributions
These conversations were allegedly scraped by an anonymous user on 4chan.
The 90K version was sourced from [this post](https://boards.4channel.org/g/thread/92487155/lmg-local-models-general-snail-edition#p92490887).
Thanks, anon!
|
botp/RyokoAI_ShareGPT52K
|
[
"task_categories:text-generation",
"size_categories:10K<n<100K",
"language:en",
"language:es",
"language:de",
"language:multilingual",
"license:cc0-1.0",
"conversation",
"rlhf",
"chatgpt",
"gpt-3.5",
"region:us"
] |
2023-08-18T00:03:21+00:00
|
{"language": ["en", "es", "de", "multilingual"], "license": "cc0-1.0", "size_categories": ["10K<n<100K"], "task_categories": ["text-generation"], "pretty_name": "ShareGPT 90K Conversations", "tags": ["conversation", "rlhf", "chatgpt", "gpt-3.5"], "duplicated_from": "RyokoAI/ShareGPT52K"}
|
2023-08-18T00:03:22+00:00
|
[] |
[
"en",
"es",
"de",
"multilingual"
] |
TAGS
#task_categories-text-generation #size_categories-10K<n<100K #language-English #language-Spanish #language-German #language-multilingual #license-cc0-1.0 #conversation #rlhf #chatgpt #gpt-3.5 #region-us
|
# Dataset Card for ShareGPT~~52K~~90K
## Dataset Description
- Homepage: N/A
- Repository: N/A
- Paper: N/A
- Leaderboard: N/A
- Point of Contact: Ronsor Labs <ronsor@URL>
### Dataset Summary
This dataset is a collection of approximately ~~52,000~~90,000 conversations scraped via the ShareGPT API before it was shut down.
These conversations include both user prompts and responses from OpenAI's ChatGPT.
This repository now contains the new 90K conversations version. The previous 52K may
be found in the 'old/' directory.
### Supported Tasks and Leaderboards
* text-generation
### Languages
This dataset is expected to primarily consist of messages in English and other Western languages.
## Dataset Structure
### Data Instances
### Data Fields
* id: conversation id
* conversations: conversation session array
* from: ChatGPT ("gpt") or the user ("human")
* value: message contents as raw HTML
### Data Splits
N/A
## Dataset Creation
### Curation Rationale
This is a decently large dataset of realistic human-AI conversations which I believe should be released
to the research community.
### Source Data
#### Initial Data Collection and Normalization
This data was collected using the ShareGPT API.
#### Who are the source language producers?
ShareGPT users and OpenAI ChatGPT.
### Annotations
#### Annotation process
N/A
#### Who are the annotators?
N/A
### Personal and Sensitive Information
This dataset *may* contain personal information, if ShareGPT users were sending such information to
ChatGPT. ChatGPT warns users not to submit personal information to it, however, so without further
evaluation, we believe that this dataset should contain little or no personal information.
## Considerations for Using the Data
### Social Impact of Dataset
This dataset may be used to train models that are competitive with OpenAI's ChatGPT. Please filter
this dataset first, as it may contain canned responses, raw HTML, and other undesirable information.
### Discussion of Biases
This dataset exhibits all the biases of OpenAI's ChatGPT models (GPT-3.5 and GPT-4) as well as the
biases of the users who uploaded the conversations.
### Other Known Limitations
N/A
## Additional Information
### Dataset Curators
None.
### Licensing Information
CC0: No Rights Reserved.
The output of machine learning algorithms is uncopyrightable in the United States and other jurisdictions.
Additionally, the OpenAI terms of service do not apply to this dataset as users of this dataset
are not accessing the OpenAI service.
TODO
### Contributions
These conversations were allegedly scraped by an anonymous user on 4chan.
The 90K version was sourced from this post.
Thanks, anon!
|
[
"# Dataset Card for ShareGPT~~52K~~90K",
"## Dataset Description\n\n- Homepage: N/A \n- Repository: N/A \n- Paper: N/A \n- Leaderboard: N/A \n- Point of Contact: Ronsor Labs <ronsor@URL>",
"### Dataset Summary\n\nThis dataset is a collection of approximately ~~52,000~~90,000 conversations scraped via the ShareGPT API before it was shut down.\nThese conversations include both user prompts and responses from OpenAI's ChatGPT.\n\nThis repository now contains the new 90K conversations version. The previous 52K may\nbe found in the 'old/' directory.",
"### Supported Tasks and Leaderboards\n\n* text-generation",
"### Languages\n\nThis dataset is expected to primarily consist of messages in English and other Western languages.",
"## Dataset Structure",
"### Data Instances",
"### Data Fields\n\n* id: conversation id\n* conversations: conversation session array\n * from: ChatGPT (\"gpt\") or the user (\"human\")\n * value: message contents as raw HTML",
"### Data Splits\n\nN/A",
"## Dataset Creation",
"### Curation Rationale\n\nThis is a decently large dataset of realistic human-AI conversations which I believe should be released\nto the research community.",
"### Source Data",
"#### Initial Data Collection and Normalization\n\nThis data was collected using the ShareGPT API.",
"#### Who are the source language producers?\n\nShareGPT users and OpenAI ChatGPT.",
"### Annotations",
"#### Annotation process\n\nN/A",
"#### Who are the annotators?\n\nN/A",
"### Personal and Sensitive Information\n\nThis dataset *may* contain personal information, if ShareGPT users were sending such information to\nChatGPT. ChatGPT warns users not to submit personal information to it, however, so without further\nevaluation, we believe that this dataset should contain little or no personal information.",
"## Considerations for Using the Data",
"### Social Impact of Dataset\n\nThis dataset may be used to train models that are competitive with OpenAI's ChatGPT. Please filter\nthis dataset first, as it may contain canned responses, raw HTML, and other undesirable information.",
"### Discussion of Biases\n\nThis dataset exhibits all the biases of OpenAI's ChatGPT models (GPT-3.5 and GPT-4) as well as the\nbiases of the users who uploaded the conversations.",
"### Other Known Limitations\n\nN/A",
"## Additional Information",
"### Dataset Curators\n\nNone.",
"### Licensing Information\n\nCC0: No Rights Reserved.\n\nThe output of machine learning algorithms is uncopyrightable in the United States and other jurisdictions.\nAdditionally, the OpenAI terms of service do not apply to this dataset as users of this dataset\nare not accessing the OpenAI service.\n\n\n\nTODO",
"### Contributions\n\nThese conversations were allegedly scraped by an anonymous user on 4chan.\n\nThe 90K version was sourced from this post.\nThanks, anon!"
] |
[
"TAGS\n#task_categories-text-generation #size_categories-10K<n<100K #language-English #language-Spanish #language-German #language-multilingual #license-cc0-1.0 #conversation #rlhf #chatgpt #gpt-3.5 #region-us \n",
"# Dataset Card for ShareGPT~~52K~~90K",
"## Dataset Description\n\n- Homepage: N/A \n- Repository: N/A \n- Paper: N/A \n- Leaderboard: N/A \n- Point of Contact: Ronsor Labs <ronsor@URL>",
"### Dataset Summary\n\nThis dataset is a collection of approximately ~~52,000~~90,000 conversations scraped via the ShareGPT API before it was shut down.\nThese conversations include both user prompts and responses from OpenAI's ChatGPT.\n\nThis repository now contains the new 90K conversations version. The previous 52K may\nbe found in the 'old/' directory.",
"### Supported Tasks and Leaderboards\n\n* text-generation",
"### Languages\n\nThis dataset is expected to primarily consist of messages in English and other Western languages.",
"## Dataset Structure",
"### Data Instances",
"### Data Fields\n\n* id: conversation id\n* conversations: conversation session array\n * from: ChatGPT (\"gpt\") or the user (\"human\")\n * value: message contents as raw HTML",
"### Data Splits\n\nN/A",
"## Dataset Creation",
"### Curation Rationale\n\nThis is a decently large dataset of realistic human-AI conversations which I believe should be released\nto the research community.",
"### Source Data",
"#### Initial Data Collection and Normalization\n\nThis data was collected using the ShareGPT API.",
"#### Who are the source language producers?\n\nShareGPT users and OpenAI ChatGPT.",
"### Annotations",
"#### Annotation process\n\nN/A",
"#### Who are the annotators?\n\nN/A",
"### Personal and Sensitive Information\n\nThis dataset *may* contain personal information, if ShareGPT users were sending such information to\nChatGPT. ChatGPT warns users not to submit personal information to it, however, so without further\nevaluation, we believe that this dataset should contain little or no personal information.",
"## Considerations for Using the Data",
"### Social Impact of Dataset\n\nThis dataset may be used to train models that are competitive with OpenAI's ChatGPT. Please filter\nthis dataset first, as it may contain canned responses, raw HTML, and other undesirable information.",
"### Discussion of Biases\n\nThis dataset exhibits all the biases of OpenAI's ChatGPT models (GPT-3.5 and GPT-4) as well as the\nbiases of the users who uploaded the conversations.",
"### Other Known Limitations\n\nN/A",
"## Additional Information",
"### Dataset Curators\n\nNone.",
"### Licensing Information\n\nCC0: No Rights Reserved.\n\nThe output of machine learning algorithms is uncopyrightable in the United States and other jurisdictions.\nAdditionally, the OpenAI terms of service do not apply to this dataset as users of this dataset\nare not accessing the OpenAI service.\n\n\n\nTODO",
"### Contributions\n\nThese conversations were allegedly scraped by an anonymous user on 4chan.\n\nThe 90K version was sourced from this post.\nThanks, anon!"
] |
[
73,
14,
46,
90,
15,
23,
6,
6,
43,
8,
5,
35,
4,
22,
21,
5,
8,
12,
67,
8,
54,
54,
10,
5,
9,
70,
39
] |
[
"passage: TAGS\n#task_categories-text-generation #size_categories-10K<n<100K #language-English #language-Spanish #language-German #language-multilingual #license-cc0-1.0 #conversation #rlhf #chatgpt #gpt-3.5 #region-us \n# Dataset Card for ShareGPT~~52K~~90K## Dataset Description\n\n- Homepage: N/A \n- Repository: N/A \n- Paper: N/A \n- Leaderboard: N/A \n- Point of Contact: Ronsor Labs <ronsor@URL>### Dataset Summary\n\nThis dataset is a collection of approximately ~~52,000~~90,000 conversations scraped via the ShareGPT API before it was shut down.\nThese conversations include both user prompts and responses from OpenAI's ChatGPT.\n\nThis repository now contains the new 90K conversations version. The previous 52K may\nbe found in the 'old/' directory.### Supported Tasks and Leaderboards\n\n* text-generation### Languages\n\nThis dataset is expected to primarily consist of messages in English and other Western languages.## Dataset Structure### Data Instances### Data Fields\n\n* id: conversation id\n* conversations: conversation session array\n * from: ChatGPT (\"gpt\") or the user (\"human\")\n * value: message contents as raw HTML### Data Splits\n\nN/A## Dataset Creation### Curation Rationale\n\nThis is a decently large dataset of realistic human-AI conversations which I believe should be released\nto the research community.### Source Data#### Initial Data Collection and Normalization\n\nThis data was collected using the ShareGPT API.#### Who are the source language producers?\n\nShareGPT users and OpenAI ChatGPT.### Annotations#### Annotation process\n\nN/A#### Who are the annotators?\n\nN/A### Personal and Sensitive Information\n\nThis dataset *may* contain personal information, if ShareGPT users were sending such information to\nChatGPT. ChatGPT warns users not to submit personal information to it, however, so without further\nevaluation, we believe that this dataset should contain little or no personal information."
] |
37e8a61f05b6141ee4e7118a90744eb5fea05f78
|
# Dataset of yakumo_ran/八雲藍/야쿠모란 (Touhou)
This is the dataset of yakumo_ran/八雲藍/야쿠모란 (Touhou), containing 500 images and their tags.
The core tags of this character are `blonde_hair, short_hair, fox_tail, tail, multiple_tails, yellow_eyes, hat, animal_ears, fox_ears, breasts, pillow_hat`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 614.16 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yakumo_ran_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 392.97 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yakumo_ran_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1158 | 786.48 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yakumo_ran_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 569.44 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yakumo_ran_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1158 | 1.02 GiB | [Download](https://huggingface.co/datasets/CyberHarem/yakumo_ran_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/yakumo_ran_touhou',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
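If you want to write the loaded items back out rather than just iterate over them, waifuc's exporters can be used. The sketch below assumes `waifuc.export.SaveExporter`, which, to our understanding, saves each image together with its metadata (`exported_dir` is a hypothetical output path; see the waifuc documentation for the full exporter list):
```python
from waifuc.export import SaveExporter
from waifuc.source import LocalSource

# re-export the extracted dataset; SaveExporter is assumed to write
# each image alongside its metadata
source = LocalSource('dataset_dir')
source.export(SaveExporter('exported_dir'))
```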
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 7 |  |  |  |  |  | 1girl, closed_mouth, long_sleeves, solo, tabard, white_dress, wide_sleeves, bangs, looking_at_viewer, white_headwear, blush, frills, large_breasts, simple_background, white_background |
| 1 | 16 |  |  |  |  |  | 1girl, bangs, long_sleeves, solo, tabard, white_dress, wide_sleeves, looking_at_viewer, white_headwear, frills, closed_mouth, simple_background, smile, hands_in_opposite_sleeves, hair_between_eyes, white_background, upper_body, blush |
| 2 | 18 |  |  |  |  |  | 1girl, long_sleeves, solo, tabard, looking_at_viewer, wide_sleeves, hands_in_opposite_sleeves, smile, white_dress, large_breasts, white_background |
| 3 | 18 |  |  |  |  |  | 1girl, hands_in_opposite_sleeves, solo, smile, wide_sleeves |
| 4 | 5 |  |  |  |  |  | 1girl, looking_at_viewer, solo, blush, open_mouth, smile, tabard, large_breasts, no_headwear, upper_body, animal_ear_fluff |
| 5 | 5 |  |  |  |  |  | 1girl, fox_mask, solo |
| 6 | 5 |  |  |  |  |  | 1girl, closed_mouth, jeans, large_breasts, looking_at_viewer, simple_background, slit_pupils, solo, bangs, barefoot, blush, seiza, white_background, blue_pants, full_body, no_tail, short_sleeves, blue_shirt, long_sleeves, sweater |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | closed_mouth | long_sleeves | solo | tabard | white_dress | wide_sleeves | bangs | looking_at_viewer | white_headwear | blush | frills | large_breasts | simple_background | white_background | smile | hands_in_opposite_sleeves | hair_between_eyes | upper_body | open_mouth | no_headwear | animal_ear_fluff | fox_mask | jeans | slit_pupils | barefoot | seiza | blue_pants | full_body | no_tail | short_sleeves | blue_shirt | sweater |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:---------------|:-------|:---------|:--------------|:---------------|:--------|:--------------------|:-----------------|:--------|:---------|:----------------|:--------------------|:-------------------|:--------|:----------------------------|:--------------------|:-------------|:-------------|:--------------|:-------------------|:-----------|:--------|:--------------|:-----------|:--------|:-------------|:------------|:----------|:----------------|:-------------|:----------|
| 0 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | |
| 1 | 16 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | | X | X | X | X | X | X | | | | | | | | | | | | | | |
| 2 | 18 |  |  |  |  |  | X | | X | X | X | X | X | | X | | | | X | | X | X | X | | | | | | | | | | | | | | | | |
| 3 | 18 |  |  |  |  |  | X | | | X | | | X | | | | | | | | | X | X | | | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | | | X | X | | | | X | | X | | X | | | X | | | X | X | X | X | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | | | X | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | |
| 6 | 5 |  |  |  |  |  | X | X | X | X | | | | X | X | | X | | X | X | X | | | | | | | | | X | X | X | X | X | X | X | X | X | X |
|
CyberHarem/yakumo_ran_touhou
|
[
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] |
2023-08-18T00:03:56+00:00
|
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
|
2024-01-14T12:34:51+00:00
|
[] |
[] |
TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
|
Dataset of yakumo\_ran/八雲藍/야쿠모란 (Touhou)
========================================
This is the dataset of yakumo\_ran/八雲藍/야쿠모란 (Touhou), containing 500 images and their tags.
The core tags of this character are 'blonde\_hair, short\_hair, fox\_tail, tail, multiple\_tails, yellow\_eyes, hat, animal\_ears, fox\_ears, breasts, pillow\_hat', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
|
[
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
44,
61,
5,
4
] |
[
"passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version"
] |
a54521890ccbe967d98a72d5d45fb0a3feb4493c
|
# Dataset of miyako_yoshika/宮古芳香 (Touhou)
This is the dataset of miyako_yoshika/宮古芳香 (Touhou), containing 500 images and their tags.
The core tags of this character are `short_hair, hat, hat_ornament, star_hat_ornament, blue_hair, blue_eyes, bangs, cabbie_hat, ribbon, breasts, purple_headwear, neck_ribbon`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:------------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 596.91 MiB | [Download](https://huggingface.co/datasets/CyberHarem/miyako_yoshika_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 373.60 MiB | [Download](https://huggingface.co/datasets/CyberHarem/miyako_yoshika_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1138 | 753.17 MiB | [Download](https://huggingface.co/datasets/CyberHarem/miyako_yoshika_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 547.66 MiB | [Download](https://huggingface.co/datasets/CyberHarem/miyako_yoshika_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1138 | 1005.23 MiB | [Download](https://huggingface.co/datasets/CyberHarem/miyako_yoshika_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/miyako_yoshika_touhou',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
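The cluster tables below suggest tags worth filtering on. As a minimal sketch, selecting only the images that carry a given tag (here `zombie_pose`, taken from cluster 0 below) can be done directly on the loaded items:
```python
from waifuc.source import LocalSource

# keep only the items whose tags include a specific cue; the membership
# test works whether item.meta['tags'] is a list or a tag -> score mapping
source = LocalSource('dataset_dir')
selected = [item for item in source if 'zombie_pose' in item.meta.get('tags', ())]
print(f'{len(selected)} images matched')
```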
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 16 |  |  |  |  |  | 1girl, jiangshi, ofuda, skirt, solo, star_(symbol), zombie_pose, open_mouth, smile |
| 1 | 6 |  |  |  |  |  | 1girl, jiangshi, ofuda, simple_background, solo, star_(symbol), white_background, zombie_pose, open_mouth, skirt, looking_at_viewer |
| 2 | 10 |  |  |  |  |  | 1girl, black_ribbon, black_skirt, jiangshi, looking_at_viewer, ofuda, open_mouth, red_shirt, short_sleeves, solo, star_(symbol), zombie_pose, cowboy_shot, lace-trimmed_sleeves, tangzhuang, medium_breasts, sharp_teeth, smile, hitodama |
| 3 | 5 |  |  |  |  |  | 1girl, black_footwear, black_skirt, blue_headwear, full_body, jiangshi, ofuda, open_mouth, red_shirt, short_sleeves, simple_background, solo, star_(symbol), white_background, zombie_pose, lace-trimmed_sleeves, looking_at_viewer, sharp_teeth, tangzhuang, shoes, smile, black_ribbon, wide_sleeves |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | jiangshi | ofuda | skirt | solo | star_(symbol) | zombie_pose | open_mouth | smile | simple_background | white_background | looking_at_viewer | black_ribbon | black_skirt | red_shirt | short_sleeves | cowboy_shot | lace-trimmed_sleeves | tangzhuang | medium_breasts | sharp_teeth | hitodama | black_footwear | blue_headwear | full_body | shoes | wide_sleeves |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------|:--------|:--------|:-------|:----------------|:--------------|:-------------|:--------|:--------------------|:-------------------|:--------------------|:---------------|:--------------|:------------|:----------------|:--------------|:-----------------------|:-------------|:-----------------|:--------------|:-----------|:-----------------|:----------------|:------------|:--------|:---------------|
| 0 | 16 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | | X | X | X | | | | | | | | | | | | | | | |
| 2 | 10 |  |  |  |  |  | X | X | X | | X | X | X | X | X | | | X | X | X | X | X | X | X | X | X | X | X | | | | | |
| 3 | 5 |  |  |  |  |  | X | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | | X | X | | X | | X | X | X | X | X |
|
CyberHarem/miyako_yoshika_touhou
|
[
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] |
2023-08-18T00:13:39+00:00
|
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
|
2024-01-14T18:17:41+00:00
|
[] |
[] |
TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
|
Dataset of miyako\_yoshika/宮古芳香 (Touhou)
========================================
This is the dataset of miyako\_yoshika/宮古芳香 (Touhou), containing 500 images and their tags.
The core tags of this character are 'short\_hair, hat, hat\_ornament, star\_hat\_ornament, blue\_hair, blue\_eyes, bangs, cabbie\_hat, ribbon, breasts, purple\_headwear, neck\_ribbon', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code
List of Clusters
----------------
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
### Table Version
|
[
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
44,
61,
5,
4
] |
[
"passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version"
] |
e8947229cb473b4c9fde8147a980757285806d5f
|
# Dataset Card for BLiterature
*BLiterature is part of a bigger project that is not yet complete. Not all information here may be accurate or accessible.*
## Dataset Description
- **Homepage:** (TODO)
- **Repository:** N/A
- **Paper:** N/A
- **Leaderboard:** N/A
- **Point of Contact:** KaraKaraWitch
### Dataset Summary
BLiterature is a raw dataset dump consisting of text from at most 260,261,224 blog posts (excluding categories and date-grouped posts) from blog.fc2.com.
### Supported Tasks and Leaderboards
This dataset is primarily intended for unsupervised training of text generation models; however, it may be useful for other purposes.
* text-classification
* text-generation
### Languages
* Japanese
## Dataset Structure
All of the data is stored in JSONL files that have been compressed into 7z archives.
### Data Instances
```json
["http://1kimono.blog49.fc2.com/blog-entry-50.html",
"<!DOCTYPE HTML\n\tPUBLIC \"-//W3C//DTD HTML 4.01 Transitional//EN\"\n\t\t\"http://www.w3.org/TR/html4/loose.dtd\">\n<!--\n<!DOCTYPE HTML\n\tPUBLIC \"-//W3C//DTD HTML 4.01//EN\"\n\t\t\"http://www.w3.org/T... (TRUNCATED)"]
```
### Data Fields
There are only 2 fields in each record: the URL and the retrieved content. The retrieved content may contain error values where the scraper ran into issues; if so, they are marked in XML as follows.
```<?xml version="1.0" encoding="utf-8"?><error>Specific Error</error>```
URLs may not match the final URL from which the page was retrieved, as redirects may have been present while scraping.
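Putting the two fields together, a minimal reading sketch might look like the following (`posts.jsonl` is a hypothetical name for one extracted shard; each line is assumed to be the two-element list shown above):
```python
import json

ERROR_PREFIX = '<?xml version="1.0" encoding="utf-8"?><error>'

with open('posts.jsonl', encoding='utf-8') as f:
    for line in f:
        url, content = json.loads(line)
        if content.startswith(ERROR_PREFIX):
            continue  # scraper error record; skip (or log) it
        # ... process the raw HTML in `content` ...
```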
#### Q-Score Distribution
Not Applicable
### Data Splits
The JSONL files were split roughly every 2,500,000 posts. Allow for a slight deviation of up to 5,000 additional posts due to how the files were saved.
## Dataset Creation
### Curation Rationale
fc2 is a Japanese blog hosting website that offers anyone a place to host their blog. As a result, the language used is more informal and relaxed than in more official sources, since anyone can post whatever they personally want.
### Source Data
#### Initial Data Collection and Normalization
None. No normalization is performed as this is a raw dump of the dataset.
#### Who are the source language producers?
The authors of each blog, who may also allow others to post on their blog domain.
### Annotations
#### Annotation process
No annotations are present.
#### Who are the annotators?
No human annotators.
### Personal and Sensitive Information
As this dataset contains information from individuals, it is more likely to contain personally identifiable information. However, we believe that each author has pre-vetted their posts in good faith to avoid such occurrences.
## Considerations for Using the Data
### Social Impact of Dataset
This dataset is intended to be useful for anyone who wishes to train a model to generate "more entertaining" content.
It may also be useful for other languages depending on your language model.
### Discussion of Biases
This dataset contains real-life references and revolves around Japanese culture. As such, there will be a bias towards it.
### Other Known Limitations
N/A
## Additional Information
### Dataset Curators
KaraKaraWitch
### Licensing Information
Apache 2.0, for all parts of which KaraKaraWitch may be considered the author. All other material is distributed under fair use principles.
Ronsor Labs is additionally allowed to relicense the dataset as long as it has gone through processing.
### Citation Information
```
@misc{bliterature,
title = {BLiterature: fc2 blogs for the masses.},
author = {KaraKaraWitch},
year = {2023},
howpublished = {\url{https://huggingface.co/datasets/KaraKaraWitch/BLiterature}},
}
```
### Name Etymology
[Literature (リテラチュア) - Reina Ueda (上田麗奈)](https://www.youtube.com/watch?v=Xo1g5HWgaRA)
`Blogs` > `B` + `Literature` > `BLiterature`
### Contributions
- [@KaraKaraWitch (Twitter)](https://twitter.com/KaraKaraWitch) for gathering this dataset.
- [neggles (Github)](https://github.com/neggles) for providing compute for the gathering of the dataset.
|
botp/RyokoAI_BLiterature-260M
|
[
"task_categories:text-classification",
"task_categories:text-generation",
"size_categories:100M<n<1B",
"language:jp",
"license:apache-2.0",
"blogs",
"training",
"text",
"not-for-all-audiences",
"region:us"
] |
2023-08-18T00:28:20+00:00
|
{"language": ["jp"], "license": "apache-2.0", "size_categories": ["100M<n<1B"], "task_categories": ["text-classification", "text-generation"], "pretty_name": "BLiterature", "tags": ["blogs", "training", "text", "not-for-all-audiences"], "duplicated_from": "RyokoAI/BLiterature-260M"}
|
2023-08-18T00:28:20+00:00
|
[] |
[
"jp"
] |
TAGS
#task_categories-text-classification #task_categories-text-generation #size_categories-100M<n<1B #language-jp #license-apache-2.0 #blogs #training #text #not-for-all-audiences #region-us
|
# Dataset Card for BLiterature
*BLiterature is part of a bigger project that is not yet complete. Not all information here may be accurate or accessible.*
## Dataset Description
- Homepage: (TODO)
- Repository: N/A
- Paper: N/A
- Leaderboard: N/A
- Point of Contact: KaraKaraWitch
### Dataset Summary
BLiterature is a raw dataset dump consisting of text from at most 260,261,224 blog posts (excluding categories and date-grouped posts) from URL.
### Supported Tasks and Leaderboards
This dataset is primarily intended for unsupervised training of text generation models; however, it may be useful for other purposes.
* text-classification
* text-generation
### Languages
* Japanese
## Dataset Structure
All the files are located in jsonl files that has been compressed into archives of 7z.
### Data Instances
### Data Fields
There is only 2 fields in the list. URL and content retrieved. content retrieved may contain values which the scraper ran into issues. If so they are marked in xml such as such.
URLs may not match the final url in which the page was retrieved from. As they may be redirects present while scraping.
#### Q-Score Distribution
Not Applicable
### Data Splits
The jsonl files were split roughly every 2,500,000 posts. Allow for a slight deviation of 5000 additional posts due to how the files were saved.
## Dataset Creation
### Curation Rationale
fc2 is a Japanese blog hosting website which offers a place for anyone to host their blog on. As a result, the language used compared to other more official sources is more informal and relaxed as anyone can post whatever they personally want.
### Source Data
#### Initial Data Collection and Normalization
None. No normalization is performed as this is a raw dump of the dataset.
#### Who are the source language producers?
The authors of each blog, which may include others to post on their blog domain as well.
### Annotations
#### Annotation process
No Annotations are present.
#### Who are the annotators?
No human annotators.
### Personal and Sensitive Information
As this dataset contains information from individuals, there is a more likely chance to find personally identifiable information. However, we believe that the author has pre-vetted their posts in good faith to avoid such occurrences.
## Considerations for Using the Data
### Social Impact of Dataset
This dataset is intended to be useful for anyone who wishes to train a model to generate "more entertaining" content.
It may also be useful for other languages depending on your language model.
### Discussion of Biases
This dataset contains real life referances and revolves around Japanese culture. As such there will be a bias towards it.
### Other Known Limitations
N/A
## Additional Information
### Dataset Curators
KaraKaraWitch
### Licensing Information
Apache 2.0, for all parts of which KaraKaraWitch may be considered authors. All other material is distributed under fair use principles.
Ronsor Labs additionally is allowed to relicense the dataset as long as it has gone through processing.
### Name Etymology
Literature (リテラチュア) - Reina Ueda (上田麗奈)
'Blogs' > 'B' + 'Literature' > 'BLiterature'
### Contributions
- @KaraKaraWitch (Twitter) for gathering this dataset.
- neggles (Github) for providing compute for the gathering of dataset.
|
[
"# Dataset Card for BLiterature\n\n*BLiterature is part of a bigger project that is not yet complete. Not all information here may be accurate or accessible.*",
"## Dataset Description\n\n- Homepage: (TODO)\n- Repository: N/A\n- Paper: N/A \n- Leaderboard: N/A\n- Point of Contact: KaraKaraWitch",
"### Dataset Summary\n\nBLiterature is a raw dataset dump consisting of text from at most 260,261,224 blog posts (excluding categories and date-grouped posts) from URL.",
"### Supported Tasks and Leaderboards\n\nThis dataset is primarily intended for unsupervised training of text generation models; however, it may be useful for other purposes.\n\n* text-classification\n* text-generation",
"### Languages\n\n* Japanese",
"## Dataset Structure\n\nAll the files are located in jsonl files that has been compressed into archives of 7z.",
"### Data Instances",
"### Data Fields\n\nThere is only 2 fields in the list. URL and content retrieved. content retrieved may contain values which the scraper ran into issues. If so they are marked in xml such as such. \n\n\n\nURLs may not match the final url in which the page was retrieved from. As they may be redirects present while scraping.",
"#### Q-Score Distribution\n\nNot Applicable",
"### Data Splits\n\nThe jsonl files were split roughly every 2,500,000 posts. Allow for a slight deviation of 5000 additional posts due to how the files were saved.",
"## Dataset Creation",
"### Curation Rationale\n\nfc2 is a Japanese blog hosting website which offers a place for anyone to host their blog on. As a result, the language used compared to other more official sources is more informal and relaxed as anyone can post whatever they personally want.",
"### Source Data",
"#### Initial Data Collection and Normalization\n\nNone. No normalization is performed as this is a raw dump of the dataset.",
"#### Who are the source language producers?\n\nThe authors of each blog, which may include others to post on their blog domain as well.",
"### Annotations",
"#### Annotation process\n\nNo Annotations are present.",
"#### Who are the annotators?\n\nNo human annotators.",
"### Personal and Sensitive Information\n\nAs this dataset contains information from individuals, there is a more likely chance to find personally identifiable information. However, we believe that the author has pre-vetted their posts in good faith to avoid such occurrences.",
"## Considerations for Using the Data",
"### Social Impact of Dataset\n\nThis dataset is intended to be useful for anyone who wishes to train a model to generate \"more entertaining\" content.\nIt may also be useful for other languages depending on your language model.",
"### Discussion of Biases\n\nThis dataset contains real life referances and revolves around Japanese culture. As such there will be a bias towards it.",
"### Other Known Limitations\n\nN/A",
"## Additional Information",
"### Dataset Curators\n\nKaraKaraWitch",
"### Licensing Information\n\nApache 2.0, for all parts of which KaraKaraWitch may be considered authors. All other material is distributed under fair use principles. \n\nRonsor Labs additionally is allowed to relicense the dataset as long as it has gone through processing.",
"### Name Etymology\n\nLiterature (リテラチュア) - Reina Ueda (上田麗奈) \n'Blogs' > 'B' + 'Literature' > 'BLiterature'",
"### Contributions\n\n- @KaraKaraWitch (Twitter) for gathering this dataset. \n- neggles (Github) for providing compute for the gathering of dataset."
] |
[
"TAGS\n#task_categories-text-classification #task_categories-text-generation #size_categories-100M<n<1B #language-jp #license-apache-2.0 #blogs #training #text #not-for-all-audiences #region-us \n",
"# Dataset Card for BLiterature\n\n*BLiterature is part of a bigger project that is not yet complete. Not all information here may be accurate or accessible.*",
"## Dataset Description\n\n- Homepage: (TODO)\n- Repository: N/A\n- Paper: N/A \n- Leaderboard: N/A\n- Point of Contact: KaraKaraWitch",
"### Dataset Summary\n\nBLiterature is a raw dataset dump consisting of text from at most 260,261,224 blog posts (excluding categories and date-grouped posts) from URL.",
"### Supported Tasks and Leaderboards\n\nThis dataset is primarily intended for unsupervised training of text generation models; however, it may be useful for other purposes.\n\n* text-classification\n* text-generation",
"### Languages\n\n* Japanese",
"## Dataset Structure\n\nAll the files are located in jsonl files that has been compressed into archives of 7z.",
"### Data Instances",
"### Data Fields\n\nThere is only 2 fields in the list. URL and content retrieved. content retrieved may contain values which the scraper ran into issues. If so they are marked in xml such as such. \n\n\n\nURLs may not match the final url in which the page was retrieved from. As they may be redirects present while scraping.",
"#### Q-Score Distribution\n\nNot Applicable",
"### Data Splits\n\nThe jsonl files were split roughly every 2,500,000 posts. Allow for a slight deviation of 5000 additional posts due to how the files were saved.",
"## Dataset Creation",
"### Curation Rationale\n\nfc2 is a Japanese blog hosting website which offers a place for anyone to host their blog on. As a result, the language used compared to other more official sources is more informal and relaxed as anyone can post whatever they personally want.",
"### Source Data",
"#### Initial Data Collection and Normalization\n\nNone. No normalization is performed as this is a raw dump of the dataset.",
"#### Who are the source language producers?\n\nThe authors of each blog, which may include others to post on their blog domain as well.",
"### Annotations",
"#### Annotation process\n\nNo Annotations are present.",
"#### Who are the annotators?\n\nNo human annotators.",
"### Personal and Sensitive Information\n\nAs this dataset contains information from individuals, there is a more likely chance to find personally identifiable information. However, we believe that the author has pre-vetted their posts in good faith to avoid such occurrences.",
"## Considerations for Using the Data",
"### Social Impact of Dataset\n\nThis dataset is intended to be useful for anyone who wishes to train a model to generate \"more entertaining\" content.\nIt may also be useful for other languages depending on your language model.",
"### Discussion of Biases\n\nThis dataset contains real life referances and revolves around Japanese culture. As such there will be a bias towards it.",
"### Other Known Limitations\n\nN/A",
"## Additional Information",
"### Dataset Curators\n\nKaraKaraWitch",
"### Licensing Information\n\nApache 2.0, for all parts of which KaraKaraWitch may be considered authors. All other material is distributed under fair use principles. \n\nRonsor Labs additionally is allowed to relicense the dataset as long as it has gone through processing.",
"### Name Etymology\n\nLiterature (リテラチュア) - Reina Ueda (上田麗奈) \n'Blogs' > 'B' + 'Literature' > 'BLiterature'",
"### Contributions\n\n- @KaraKaraWitch (Twitter) for gathering this dataset. \n- neggles (Github) for providing compute for the gathering of dataset."
] |
[
68,
37,
42,
47,
49,
6,
29,
6,
80,
12,
41,
5,
56,
4,
31,
30,
5,
12,
15,
56,
8,
49,
35,
10,
5,
11,
62,
47,
45
] |
[
"passage: TAGS\n#task_categories-text-classification #task_categories-text-generation #size_categories-100M<n<1B #language-jp #license-apache-2.0 #blogs #training #text #not-for-all-audiences #region-us \n# Dataset Card for BLiterature\n\n*BLiterature is part of a bigger project that is not yet complete. Not all information here may be accurate or accessible.*## Dataset Description\n\n- Homepage: (TODO)\n- Repository: N/A\n- Paper: N/A \n- Leaderboard: N/A\n- Point of Contact: KaraKaraWitch### Dataset Summary\n\nBLiterature is a raw dataset dump consisting of text from at most 260,261,224 blog posts (excluding categories and date-grouped posts) from URL.### Supported Tasks and Leaderboards\n\nThis dataset is primarily intended for unsupervised training of text generation models; however, it may be useful for other purposes.\n\n* text-classification\n* text-generation### Languages\n\n* Japanese## Dataset Structure\n\nAll the files are located in jsonl files that has been compressed into archives of 7z.### Data Instances### Data Fields\n\nThere is only 2 fields in the list. URL and content retrieved. content retrieved may contain values which the scraper ran into issues. If so they are marked in xml such as such. \n\n\n\nURLs may not match the final url in which the page was retrieved from. As they may be redirects present while scraping.#### Q-Score Distribution\n\nNot Applicable### Data Splits\n\nThe jsonl files were split roughly every 2,500,000 posts. Allow for a slight deviation of 5000 additional posts due to how the files were saved.## Dataset Creation### Curation Rationale\n\nfc2 is a Japanese blog hosting website which offers a place for anyone to host their blog on. As a result, the language used compared to other more official sources is more informal and relaxed as anyone can post whatever they personally want.### Source Data"
] |
bc05acf605d3a230838d143dc8bcb06fd949c806
|
# Dataset Card for Syosetu711K
*The BigKnow2022 dataset and its subsets are not yet complete. Not all information here may be accurate or accessible.*
## Dataset Description
- **Homepage:** (TODO)
- **Repository:** <https://github.com/RyokoAI/BigKnow2022>
- **Paper:** N/A
- **Leaderboard:** N/A
- **Point of Contact:** Ronsor/undeleted <[email protected]>
### Dataset Summary
Syosetu711K is a dataset composed of approximately 711,700 novels scraped from the Japanese novel self-publishing
website Syosetuka ni Narou (JA: 小説家になろう, lit. "Let's Become a Novelist") between March 26 and March 27, 2023.
The dataset contains most if not all novels published on the site, regardless of length or quality; however, we
include metadata so users of this dataset can filter and evaluate its contents.
Syosetu711Kは、日本の小説投稿サイト「小説家になろう」から2023年3月26日から27日にかけてスクレイプされた約711,700冊の小説から
構成されるデータセットです。このデータセットには、長さや品質に関係なく、サイトに掲載されているほとんどの小説が含まれています。ただし、
各小説のIDも含まれているため、小説家になろうAPIを使ってその情報を検索することができます。
### Supported Tasks and Leaderboards
This dataset is primarily intended for unsupervised training of text generation models; however, it may be useful for other purposes.
* text-classification
* text-generation
### Languages
* Japanese
## Dataset Structure
### Data Instances
```json
{
"text": "【小説タイトル】\n焼けて爛れる恋よりも、微睡む優しい愛が欲しい\n【Nコード】\nN5029ID\n【作者名】\n秋暁秋季\n【あらすじ】\n俺の彼女は物凄く気の多い人だった。\nお眼鏡に適う奴が居れば、瞳孔を蕩
けさせる人だった。\nその癖照れ屋で、すぐに目を逸らす。\nな...",
"meta": {
"subset": "syosetu",
"q": 0.6,
"id": "N5029ID",
"author": "秋暁秋季",
"userid": 719797,
"title": "焼けて爛れる恋よりも、微睡む優しい愛が欲しい",
"length": 871,
"points": 0,
"lang": "ja",
"chapters": 1,
"keywords": ["気が多い", "浮気性", "無愛想", "照れる", "嫉妬", "好みではない", "クソデカ感情", "空気のような安心感"],
"isr15": 0,
"genre": 102,
"biggenre": 1
}
}
{
"text": "【小説タイトル】\n【能力者】\n【Nコード】\nN9864IB\n【作者名】\n夢音いちご\n【あらすじ】\n私立アビリティ学園。\n小・中・高・大が一貫となった、大規模な名門校。\nそして、ここは規模の大きさだけ
でなく、ある特殊な制度を設けて\nいることでも有名だ。\nそれ...",
"meta": {
"subset": "syosetu",
"q": 0.6,
"id": "N9864IB",
"author": "夢音いちご",
"userid": 1912777,
"title": "【能力者】",
"length": 2334,
"points": 0,
"lang": "ja",
"chapters": 2,
"keywords": ["ガールズラブ", "身分差", "伝奇", "日常", "青春", "ラブコメ", "女主人公", "学園", "魔法", "超能力"],
"isr15": 0,
"genre": 202,
"biggenre": 2
}
}
```
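Instances like the two above can be inspected directly with the 🤗 `datasets` library. The following is a minimal sketch, not part of the original card: the repository ID is taken from this card's repository metadata (`RyokoAI/Syosetu711K`), and the single `train` split is an assumption; streaming avoids downloading all ~711,700 novels up front.

```python
from datasets import load_dataset

# Stream the corpus rather than downloading it in full.
# Repository ID and split name are assumptions based on this card's metadata.
ds = load_dataset("RyokoAI/Syosetu711K", split="train", streaming=True)

for record in ds.take(3):
    meta = record["meta"]
    print(meta["id"], meta["title"], f"q={meta['q']}", f"chapters={meta['chapters']}")
    print(record["text"][:80].replace("\n", " "), "...")
```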
### Data Fields
* `text`: the actual novel text, all chapters
* `meta`: novel metadata
  * `subset`: dataset tag: `syosetu`
  * `lang`: dataset language: `ja` (Japanese)
  * `id`: novel ID/ncode
  * `author`: author name
  * `userid`: author user ID
  * `title`: novel title
  * `length`: novel length in words
  * `points`: global points (corresponds to `global_point` from the Syosetu API)
  * `q`: q-score (quality score) calculated based on `points`
  * `chapters`: number of chapters (corresponds to `general_all_no` from the Syosetu API)
  * `keywords`: array of novel keywords (corresponds to `keyword` from the Syosetu API, split on spaces)
  * `isr15`: whether the novel is rated R15+
  * `genre`: novel genre ID (optional, see Syosetu API documentation)
  * `biggenre`: general novel genre ID (optional, see Syosetu API documentation)
  * `isr18`: whether the novel is rated R18+
  * `nocgenre`: novel genre ID (optional, only available if `isr18` is true, see Syosetu API documentation)
*For further reference, see the Syosetuka ni Narou API documentation: <https://dev.syosetu.com/man/api/> (JA).*
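Because every record carries this metadata, quality and content filtering can happen at load time. A hedged sketch using the field names above follows; the thresholds are arbitrary examples, not recommendations from the curators.

```python
from datasets import load_dataset

ds = load_dataset("RyokoAI/Syosetu711K", split="train", streaming=True)

# Keep general-audience novels with an above-median q-score and a few chapters.
filtered = ds.filter(
    lambda r: r["meta"]["q"] >= 0.7
    and r["meta"]["isr15"] == 0
    and r["meta"]["chapters"] >= 3
)

for record in filtered.take(2):
    print(record["meta"]["title"], record["meta"]["points"])
```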
#### Q-Score Distribution
```
0.00: 0
0.10: 0
0.20: 0
0.30: 0
0.40: 0
0.50: 213005
0.60: 331393
0.70: 101971
0.80: 63877
0.90: 1542
1.00: 2
```
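A distribution like the one above can be reproduced by bucketing each record's `q` value. The exact rounding rule behind the published counts is an assumption in this sketch.

```python
from collections import Counter

from datasets import load_dataset

ds = load_dataset("RyokoAI/Syosetu711K", split="train", streaming=True)

# Bucket q-scores to one decimal place; iterating the full stream is slow
# but avoids holding the whole corpus in memory.
buckets = Counter(round(record["meta"]["q"], 1) for record in ds)
for edge in sorted(buckets):
    print(f"{edge:.2f}: {buckets[edge]}")
```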
### Data Splits
No splitting of the data was performed.
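If a held-out evaluation set is needed, one can be carved out locally, for example as sketched below. This loads the full dataset (not streaming), which is large, and fixes a seed for reproducibility.

```python
from datasets import load_dataset

# Non-streaming load; ensure sufficient disk space for the full corpus.
ds = load_dataset("RyokoAI/Syosetu711K", split="train")
splits = ds.train_test_split(test_size=0.01, seed=42)
train_ds, eval_ds = splits["train"], splits["test"]
print(len(train_ds), len(eval_ds))
```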
## Dataset Creation
### Curation Rationale
Syosetuka ni Narou is the most popular website in Japan for authors wishing to self-publish their novels online. Many works on
the site have been picked up by large commercial publishers. Because of this, we believe that this dataset provides a large corpus
of high-quality, creative content in the Japanese language.
### Source Data
#### Initial Data Collection and Normalization
*More information about any referenced scripts, commands, or programs used may be found in the BigKnow2022 GitHub repository.*
First, metadata for all novels on the site was gathered into a JSON lines (JSONL) file. The Syosetuka ni Narou API was used to
obtain this information.
Second, this listing was used to create a secondary text file containing a list of only the novel "ncodes," or IDs. This
secondary file was distributed to downloader nodes.
Third, the sister site <https://pdfnovels.net> was queried with each novel ID, and the resulting PDF was saved for later processing.
Fourth, the `pdftotext` tool was used to convert the PDF files to text documents. A few other scripts were then used to clean up
the resulting text files.
Finally, the text files and other metadata were converted into the specified data field schema above, and the resulting JSON entries
were concatenated into the Syosetu711K dataset. The version uploaded to this repository, however, is split into multiple files,
numbered 00 through 20 inclusive.
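The per-novel fetch-and-convert steps (three and four) might look roughly like the sketch below. The actual scripts live in the BigKnow2022 repository; the pdfnovels.net URL pattern and the example ncode here are assumptions, not something documented in this card.

```python
import json
import subprocess
import urllib.request

ncode = "n5029id"  # hypothetical example ID; the real lists came from the API metadata dump

# Step three: fetch the rendered PDF from the sister site (URL pattern assumed).
urllib.request.urlretrieve(f"https://pdfnovels.net/{ncode}/main.pdf", f"{ncode}.pdf")

# Step four: convert the PDF to plain text with the pdftotext tool.
subprocess.run(["pdftotext", "-enc", "UTF-8", f"{ncode}.pdf", f"{ncode}.txt"], check=True)

# Final step: wrap the text and metadata into the schema described above.
with open(f"{ncode}.txt", encoding="utf-8") as f:
    entry = {"text": f.read(), "meta": {"subset": "syosetu", "lang": "ja", "id": ncode.upper()}}
print(json.dumps(entry, ensure_ascii=False)[:120])
```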
#### Who are the source language producers?
The authors of each novel.
### Annotations
#### Annotation process
Titles and general genre were collected alongside the novel text and IDs.
#### Who are the annotators?
There were no human annotators.
### Personal and Sensitive Information
The dataset contains only works of fiction, and we do not believe it contains any PII.
## Considerations for Using the Data
### Social Impact of Dataset
This dataset is intended to be useful for anyone who wishes to train a model to generate "more entertaining" content in Japanese.
It may also be useful for other languages depending on your language model.
### Discussion of Biases
This dataset is composed of fictional works by various authors. Because of this fact, the contents of this dataset will reflect
the biases of those authors. **Additionally, this dataset contains NSFW material and was not filtered. Beware of stereotypes.**
### Other Known Limitations
N/A
## Additional Information
### Dataset Curators
Ronsor Labs
### Licensing Information
Apache 2.0, for all parts of which Ronsor Labs or the Ryoko AI Production Committee may be considered authors. All other material is
distributed under fair use principles.
### Citation Information
```
@misc{ryokoai2023-bigknow2022,
title = {BigKnow2022: Bringing Language Models Up to Speed},
author = {Ronsor},
year = {2023},
howpublished = {\url{https://github.com/RyokoAI/BigKnow2022}},
}
```
### Contributions
Thanks to @ronsor (GH) for gathering this dataset.
|
botp/RyokoAI_Syosetu711K
|
[
"task_categories:text-classification",
"task_categories:text-generation",
"size_categories:100K<n<1M",
"language:ja",
"license:apache-2.0",
"novel",
"training",
"region:us"
] |
2023-08-18T00:29:55+00:00
|
{"language": ["ja"], "license": "apache-2.0", "size_categories": ["100K<n<1M"], "task_categories": ["text-classification", "text-generation"], "pretty_name": "Syosetuka ni Narou 711K", "tags": ["novel", "training"], "duplicated_from": "RyokoAI/Syosetu711K"}
|
2023-08-18T00:29:56+00:00
|
[] |
[
"ja"
] |
TAGS
#task_categories-text-classification #task_categories-text-generation #size_categories-100K<n<1M #language-Japanese #license-apache-2.0 #novel #training #region-us
|
# Dataset Card for Syosetu711K
*The BigKnow2022 dataset and its subsets are not yet complete. Not all information here may be accurate or accessible.*
## Dataset Description
- Homepage: (TODO)
- Repository: <URL>
- Paper: N/A
- Leaderboard: N/A
- Point of Contact: Ronsor/undeleted <ronsor@URL>
### Dataset Summary
Syosetu711K is a dataset composed of approximately 711,700 novels scraped from the Japanese novel self-publishing
website Syosetuka ni Narou (JA: 小説家になろう, lit. "Let's Become a Novelist") between March 26 and March 27, 2023.
The dataset contains most if not all novels published on the site, regardless of length or quality; however, we
include metadata so users of this dataset can filter and evaluate its contents.
Syosetu711Kは、日本の小説投稿サイト「小説家になろう」から2023年3月26日から27日にかけてスクレイプされた約711,700冊の小説から
構成されるデータセットです。このデータセットには、長さや品質に関係なく、サイトに掲載されているほとんどの小説が含まれています。ただし、
各小説のIDも含まれているため、小説家になろうAPIを使ってその情報を検索することができます。
### Supported Tasks and Leaderboards
This dataset is primarily intended for unsupervised training of text generation models; however, it may be useful for other purposes.
* text-classification
* text-generation
### Languages
* Japanese
## Dataset Structure
### Data Instances
### Data Fields
* 'text': the actual novel text, all chapters
* 'meta': novel metadata
  * 'subset': dataset tag: 'syosetu'
  * 'lang': dataset language: 'ja' (Japanese)
  * 'id': novel ID/ncode
  * 'author': author name
  * 'userid': author user ID
  * 'title': novel title
  * 'length': novel length in words
  * 'points': global points (corresponds to 'global_point' from the Syosetu API)
  * 'q': q-score (quality score) calculated based on 'points'
  * 'chapters': number of chapters (corresponds to 'general_all_no' from the Syosetu API)
  * 'keywords': array of novel keywords (corresponds to 'keyword' from the Syosetu API, split on spaces)
  * 'isr15': whether the novel is rated R15+
  * 'genre': novel genre ID (optional, see Syosetu API documentation)
  * 'biggenre': general novel genre ID (optional, see Syosetu API documentation)
  * 'isr18': whether the novel is rated R18+
  * 'nocgenre': novel genre ID (optional, only available if 'isr18' is true, see Syosetu API documentation)
*For further reference, see the Syosetuka ni Narou API documentation: <URL> (JA).*
#### Q-Score Distribution
### Data Splits
No splitting of the data was performed.
## Dataset Creation
### Curation Rationale
Syosetuka ni Narou is the most popular website in Japan for authors wishing to self-publish their novels online. Many works on
the site have been picked up by large commercial publishers. Because of this, we believe that this dataset provides a large corpus
of high-quality, creative content in the Japanese language.
### Source Data
#### Initial Data Collection and Normalization
*More information about any referenced scripts, commands, or programs used may be found in the BigKnow2022 GitHub repository.*
First, metadata for all novels on the site was gathered into a JSON lines (JSONL) file. The Syosetuka ni Narou API was used to
obtain this information.
Second, this listing was used to create a secondary text file containing a list of only the novel "ncodes," or IDs. This
secondary file was distributed to downloader nodes.
Third, the sister site <URL> was queried with each novel ID, and the resulting PDF was saved for later processing.
Fourth, the 'pdftotext' tool was used to convert the PDF files to text documents. A few other scripts were then used to clean up
the resulting text files.
Finally, the text files and other metadata were converted into the specified data field schema above, and the resulting JSON entries
were concatenated into the Syosetu711K dataset. The version uploaded to this repository, however, is split into multiple files,
numbered 00 through 20 inclusive.
#### Who are the source language producers?
The authors of each novel.
### Annotations
#### Annotation process
Titles and general genre were collected alongside the novel text and IDs.
#### Who are the annotators?
There were no human annotators.
### Personal and Sensitive Information
The dataset contains only works of fiction, and we do not believe it contains any PII.
## Considerations for Using the Data
### Social Impact of Dataset
This dataset is intended to be useful for anyone who wishes to train a model to generate "more entertaining" content in Japanese.
It may also be useful for other languages depending on your language model.
### Discussion of Biases
This dataset is composed of fictional works by various authors. Because of this fact, the contents of this dataset will reflect
the biases of those authors. Additionally, this dataset contains NSFW material and was not filtered. Beware of stereotypes.
### Other Known Limitations
N/A
## Additional Information
### Dataset Curators
Ronsor Labs
### Licensing Information
Apache 2.0, for all parts of which Ronsor Labs or the Ryoko AI Production Committee may be considered authors. All other material is
distributed under fair use principles.
### Contributions
Thanks to @ronsor (GH) for gathering this dataset.
|
[
"# Dataset Card for Syosetu711K\n\n*The BigKnow2022 dataset and its subsets are not yet complete. Not all information here may be accurate or accessible.*",
"## Dataset Description\n\n- Homepage: (TODO)\n- Repository: <URL\n- Paper: N/A \n- Leaderboard: N/A\n- Point of Contact: Ronsor/undeleted <ronsor@URL>",
"### Dataset Summary\n\nSyosetu711K is a dataset composed of approximately 711,700 novels scraped from the Japanese novel self-publishing\nwebsite Syosetuka ni Narou (JA: 小説家になろう, lit. \"Let's Become a Novelist\") between March 26 and March 27, 2023.\nThe dataset contains most if not all novels published on the site, regardless of length or quality; however, we\ninclude metadata so users of this dataset can filter and evaluate its contents.\n\nSyosetu711Kは、日本の小説投稿サイト「小説家になろう」から2023年3月26日から27日にかけてスクレイプされた約711,700冊の小説から\n構成されるデータセットです。このデータセットには、長さや品質に関係なく、サイトに掲載されているほとんどの小説が含まれています。ただし、\n各小説のIDも含まれているため、小説家になろうAPIを使ってその情報を検索することができます。",
"### Supported Tasks and Leaderboards\n\nThis dataset is primarily intended for unsupervised training of text generation models; however, it may be useful for other purposes.\n\n* text-classification\n* text-generation",
"### Languages\n\n* Japanese",
"## Dataset Structure",
"### Data Instances",
"### Data Fields\n\n* 'text': the actual novel text, all chapters\n* 'meta': novel metadata\n * 'subset': dataset tag: 'syosetu'\n * 'lang': dataset language: 'ja' (Japanese)\n * 'id': novel ID/ncode\n * 'author': author name\n * 'userid': author user ID\n * 'title': novel title\n * 'length': novel length in words\n * 'points': global points (corresponds to 'global_point' from the Syosetu API)\n * 'q': q-score (quality score) calculated based on 'points'\n * 'chapters': number of chapters (corresponds to 'general_all_no' from the Syosetu API)\n * 'keywords': array of novel keywords (corresponds to 'keyword' from the Syosetu API, split on spaces)\n * 'isr15': whether the novel is rated R15+\n * 'genre': novel genre ID (optional, see Syosetu API documentation)\n * 'biggenre': general novel genre ID (optional, see Syosetu API documentation)\n * 'isr18': whether the novel is rated R18+\n * 'nocgenre': novel genre ID (optional, only available if 'isr18' is true, see Syosetu API documentation)\n \n*For further reference, see the Syosetuka ni Narou API documentation: <URL (JA).*",
"#### Q-Score Distribution",
"### Data Splits\n\nNo splitting of the data was performed.",
"## Dataset Creation",
"### Curation Rationale\n\nSyosetuka ni Narou is the most popular website in Japan for authors wishing to self-publish their novels online. Many works on\nthe site been picked up by large commercial publishers. Because of this, we believe that this dataset provides a large corpus\nof high-quality, creative content in the Japanese language.",
"### Source Data",
"#### Initial Data Collection and Normalization\n\n*More information about any referenced scripts, commands, or programs used may be found in the BigKnow2022 GitHub repository.*\n\nFirst, metadata for all novels on the site was gathered into a JSON lines (JSONL) file. The Syosetuka ni Narou API was used to\nobtain this information.\n\nSecond, this listing was used to create a secondary text file containing a list of only the novel \"ncodes,\" or IDs. This\nsecondary file was distributed to downloader nodes.\n\nThird, the sister site <URL> was queried with each novel ID, and the resulting PDF was saved for later processing.\n\nFourth, the 'pdftotext' tool was used to convert the PDF files to text documents. A few other scripts were then used to clean up\nthe resulting text files.\n\nFinally, the text files and other metadata were converted into the specified data field schema above, and the resulting JSON entries\nwere concatenated into the Syosetu711K dataset. The version uploaded to this repository, however, is split into multiple files,\nnumbered 00 through 20 inclusive.",
"#### Who are the source language producers?\n\nThe authors of each novel.",
"### Annotations",
"#### Annotation process\n\nTitles and general genre were collected alongside the novel text and IDs.",
"#### Who are the annotators?\n\nThere were no human annotators.",
"### Personal and Sensitive Information\n\nThe dataset contains only works of fiction, and we do not believe it contains any PII.",
"## Considerations for Using the Data",
"### Social Impact of Dataset\n\nThis dataset is intended to be useful for anyone who wishes to train a model to generate \"more entertaining\" content in Japanese.\nIt may also be useful for other languages depending on your language model.",
"### Discussion of Biases\n\nThis dataset is composed of fictional works by various authors. Because of this fact, the contents of this dataset will reflect\nthe biases of those authors. Additionally, this dataset contains NSFW material and was not filtered. Beware of stereotypes.",
"### Other Known Limitations\n\nN/A",
"## Additional Information",
"### Dataset Curators\n\nRonsor Labs",
"### Licensing Information\n\nApache 2.0, for all parts of which Ronsor Labs or the Ryoko AI Production Committee may be considered authors. All other material is\ndistributed under fair use principles.",
"### Contributions\n\nThanks to @ronsor (GH) for gathering this dataset."
] |
[
"TAGS\n#task_categories-text-classification #task_categories-text-generation #size_categories-100K<n<1M #language-Japanese #license-apache-2.0 #novel #training #region-us \n",
"# Dataset Card for Syosetu711K\n\n*The BigKnow2022 dataset and its subsets are not yet complete. Not all information here may be accurate or accessible.*",
"## Dataset Description\n\n- Homepage: (TODO)\n- Repository: <URL\n- Paper: N/A \n- Leaderboard: N/A\n- Point of Contact: Ronsor/undeleted <ronsor@URL>",
"### Dataset Summary\n\nSyosetu711K is a dataset composed of approximately 711,700 novels scraped from the Japanese novel self-publishing\nwebsite Syosetuka ni Narou (JA: 小説家になろう, lit. \"Let's Become a Novelist\") between March 26 and March 27, 2023.\nThe dataset contains most if not all novels published on the site, regardless of length or quality; however, we\ninclude metadata so users of this dataset can filter and evaluate its contents.\n\nSyosetu711Kは、日本の小説投稿サイト「小説家になろう」から2023年3月26日から27日にかけてスクレイプされた約711,700冊の小説から\n構成されるデータセットです。このデータセットには、長さや品質に関係なく、サイトに掲載されているほとんどの小説が含まれています。ただし、\n各小説のIDも含まれているため、小説家になろうAPIを使ってその情報を検索することができます。",
"### Supported Tasks and Leaderboards\n\nThis dataset is primarily intended for unsupervised training of text generation models; however, it may be useful for other purposes.\n\n* text-classification\n* text-generation",
"### Languages\n\n* Japanese",
"## Dataset Structure",
"### Data Instances",
"### Data Fields\n\n* 'text': the actual novel text, all chapters\n* 'meta': novel metadata\n * 'subset': dataset tag: 'syosetu'\n * 'lang': dataset language: 'ja' (Japanese)\n * 'id': novel ID/ncode\n * 'author': author name\n * 'userid': author user ID\n * 'title': novel title\n * 'length': novel length in words\n * 'points': global points (corresponds to 'global_point' from the Syosetu API)\n * 'q': q-score (quality score) calculated based on 'points'\n * 'chapters': number of chapters (corresponds to 'general_all_no' from the Syosetu API)\n * 'keywords': array of novel keywords (corresponds to 'keyword' from the Syosetu API, split on spaces)\n * 'isr15': whether the novel is rated R15+\n * 'genre': novel genre ID (optional, see Syosetu API documentation)\n * 'biggenre': general novel genre ID (optional, see Syosetu API documentation)\n * 'isr18': whether the novel is rated R18+\n * 'nocgenre': novel genre ID (optional, only available if 'isr18' is true, see Syosetu API documentation)\n \n*For further reference, see the Syosetuka ni Narou API documentation: <URL (JA).*",
"#### Q-Score Distribution",
"### Data Splits\n\nNo splitting of the data was performed.",
"## Dataset Creation",
"### Curation Rationale\n\nSyosetuka ni Narou is the most popular website in Japan for authors wishing to self-publish their novels online. Many works on\nthe site been picked up by large commercial publishers. Because of this, we believe that this dataset provides a large corpus\nof high-quality, creative content in the Japanese language.",
"### Source Data",
"#### Initial Data Collection and Normalization\n\n*More information about any referenced scripts, commands, or programs used may be found in the BigKnow2022 GitHub repository.*\n\nFirst, metadata for all novels on the site was gathered into a JSON lines (JSONL) file. The Syosetuka ni Narou API was used to\nobtain this information.\n\nSecond, this listing was used to create a secondary text file containing a list of only the novel \"ncodes,\" or IDs. This\nsecondary file was distributed to downloader nodes.\n\nThird, the sister site <URL> was queried with each novel ID, and the resulting PDF was saved for later processing.\n\nFourth, the 'pdftotext' tool was used to convert the PDF files to text documents. A few other scripts were then used to clean up\nthe resulting text files.\n\nFinally, the text files and other metadata were converted into the specified data field schema above, and the resulting JSON entries\nwere concatenated into the Syosetu711K dataset. The version uploaded to this repository, however, is split into multiple files,\nnumbered 00 through 20 inclusive.",
"#### Who are the source language producers?\n\nThe authors of each novel.",
"### Annotations",
"#### Annotation process\n\nTitles and general genre were collected alongside the novel text and IDs.",
"#### Who are the annotators?\n\nThere were no human annotators.",
"### Personal and Sensitive Information\n\nThe dataset contains only works of fiction, and we do not believe it contains any PII.",
"## Considerations for Using the Data",
"### Social Impact of Dataset\n\nThis dataset is intended to be useful for anyone who wishes to train a model to generate \"more entertaining\" content in Japanese.\nIt may also be useful for other languages depending on your language model.",
"### Discussion of Biases\n\nThis dataset is composed of fictional works by various authors. Because of this fact, the contents of this dataset will reflect\nthe biases of those authors. Additionally, this dataset contains NSFW material and was not filtered. Beware of stereotypes.",
"### Other Known Limitations\n\nN/A",
"## Additional Information",
"### Dataset Curators\n\nRonsor Labs",
"### Licensing Information\n\nApache 2.0, for all parts of which Ronsor Labs or the Ryoko AI Production Committee may be considered authors. All other material is\ndistributed under fair use principles.",
"### Contributions\n\nThanks to @ronsor (GH) for gathering this dataset."
] |
[
59,
40,
48,
211,
49,
6,
6,
6,
341,
8,
15,
5,
75,
4,
268,
17,
5,
22,
17,
30,
8,
51,
71,
10,
5,
10,
44,
21
] |
[
"passage: TAGS\n#task_categories-text-classification #task_categories-text-generation #size_categories-100K<n<1M #language-Japanese #license-apache-2.0 #novel #training #region-us \n# Dataset Card for Syosetu711K\n\n*The BigKnow2022 dataset and its subsets are not yet complete. Not all information here may be accurate or accessible.*## Dataset Description\n\n- Homepage: (TODO)\n- Repository: <URL\n- Paper: N/A \n- Leaderboard: N/A\n- Point of Contact: Ronsor/undeleted <ronsor@URL>### Dataset Summary\n\nSyosetu711K is a dataset composed of approximately 711,700 novels scraped from the Japanese novel self-publishing\nwebsite Syosetuka ni Narou (JA: 小説家になろう, lit. \"Let's Become a Novelist\") between March 26 and March 27, 2023.\nThe dataset contains most if not all novels published on the site, regardless of length or quality; however, we\ninclude metadata so users of this dataset can filter and evaluate its contents.\n\nSyosetu711Kは、日本の小説投稿サイト「小説家になろう」から2023年3月26日から27日にかけてスクレイプされた約711,700冊の小説から\n構成されるデータセットです。このデータセットには、長さや品質に関係なく、サイトに掲載されているほとんどの小説が含まれています。ただし、\n各小説のIDも含まれているため、小説家になろうAPIを使ってその情報を検索することができます。### Supported Tasks and Leaderboards\n\nThis dataset is primarily intended for unsupervised training of text generation models; however, it may be useful for other purposes.\n\n* text-classification\n* text-generation### Languages\n\n* Japanese## Dataset Structure### Data Instances",
"passage: ### Data Fields\n\n* 'text': the actual novel text, all chapters\n* 'meta': novel metadata\n * 'subset': dataset tag: 'syosetu'\n * 'lang': dataset language: 'ja' (Japanese)\n * 'id': novel ID/ncode\n * 'author': author name\n * 'userid': author user ID\n * 'title': novel title\n * 'length': novel length in words\n * 'points': global points (corresponds to 'global_point' from the Syosetu API)\n * 'q': q-score (quality score) calculated based on 'points'\n * 'chapters': number of chapters (corresponds to 'general_all_no' from the Syosetu API)\n * 'keywords': array of novel keywords (corresponds to 'keyword' from the Syosetu API, split on spaces)\n * 'isr15': whether the novel is rated R15+\n * 'genre': novel genre ID (optional, see Syosetu API documentation)\n * 'biggenre': general novel genre ID (optional, see Syosetu API documentation)\n * 'isr18': whether the novel is rated R18+\n * 'nocgenre': novel genre ID (optional, only available if 'isr18' is true, see Syosetu API documentation)\n \n*For further reference, see the Syosetuka ni Narou API documentation: <URL (JA).*#### Q-Score Distribution### Data Splits\n\nNo splitting of the data was performed.## Dataset Creation### Curation Rationale\n\nSyosetuka ni Narou is the most popular website in Japan for authors wishing to self-publish their novels online. Many works on\nthe site been picked up by large commercial publishers. Because of this, we believe that this dataset provides a large corpus\nof high-quality, creative content in the Japanese language.### Source Data#### Initial Data Collection and Normalization\n\n*More information about any referenced scripts, commands, or programs used may be found in the BigKnow2022 GitHub repository.*\n\nFirst, metadata for all novels on the site was gathered into a JSON lines (JSONL) file. The Syosetuka ni Narou API was used to\nobtain this information.\n\nSecond, this listing was used to create a secondary text file containing a list of only the novel \"ncodes,\" or IDs. This\nsecondary file was distributed to downloader nodes.\n\nThird, the sister site <URL> was queried with each novel ID, and the resulting PDF was saved for later processing.\n\nFourth, the 'pdftotext' tool was used to convert the PDF files to text documents. A few other scripts were then used to clean up\nthe resulting text files.\n\nFinally, the text files and other metadata were converted into the specified data field schema above, and the resulting JSON entries\nwere concatenated into the Syosetu711K dataset. The version uploaded to this repository, however, is split into multiple files,\nnumbered 00 through 20 inclusive.#### Who are the source language producers?\n\nThe authors of each novel.### Annotations#### Annotation process\n\nTitles and general genre were collected alongside the novel text and IDs.#### Who are the annotators?\n\nThere were no human annotators.### Personal and Sensitive Information\n\nThe dataset contains only works of fiction, and we do not believe it contains any PII.## Considerations for Using the Data"
] |
6586bb48380c29d2cc3cf0352d2ccda98cd77ad1
|
# Dataset Card for "PKDD_GPTNEO_Baseline"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
EgilKarlsen/PKDD_GPTNEO_Baseline
|
[
"region:us"
] |
2023-08-18T00:31:13+00:00
|
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "0", "dtype": "float32"}, {"name": "1", "dtype": "float32"}, {"name": "2", "dtype": "float32"}, {"name": "3", "dtype": "float32"}, {"name": "4", "dtype": "float32"}, {"name": "5", "dtype": "float32"}, {"name": "6", "dtype": "float32"}, {"name": "7", "dtype": "float32"}, {"name": "8", "dtype": "float32"}, {"name": "9", "dtype": "float32"}, {"name": "10", "dtype": "float32"}, {"name": "11", "dtype": "float32"}, {"name": "12", "dtype": "float32"}, {"name": "13", "dtype": "float32"}, {"name": "14", "dtype": "float32"}, {"name": "15", "dtype": "float32"}, {"name": "16", "dtype": "float32"}, {"name": "17", "dtype": "float32"}, {"name": "18", "dtype": "float32"}, {"name": "19", "dtype": "float32"}, {"name": "20", "dtype": "float32"}, {"name": "21", "dtype": "float32"}, {"name": "22", "dtype": "float32"}, {"name": "23", "dtype": "float32"}, {"name": "24", "dtype": "float32"}, {"name": "25", "dtype": "float32"}, {"name": "26", "dtype": "float32"}, {"name": "27", "dtype": "float32"}, {"name": "28", "dtype": "float32"}, {"name": "29", "dtype": "float32"}, {"name": "30", "dtype": "float32"}, {"name": "31", "dtype": "float32"}, {"name": "32", "dtype": "float32"}, {"name": "33", "dtype": "float32"}, {"name": "34", "dtype": "float32"}, {"name": "35", "dtype": "float32"}, {"name": "36", "dtype": "float32"}, {"name": "37", "dtype": "float32"}, {"name": "38", "dtype": "float32"}, {"name": "39", "dtype": "float32"}, {"name": "40", "dtype": "float32"}, {"name": "41", "dtype": "float32"}, {"name": "42", "dtype": "float32"}, {"name": "43", "dtype": "float32"}, {"name": "44", "dtype": "float32"}, {"name": "45", "dtype": "float32"}, {"name": "46", "dtype": "float32"}, {"name": "47", "dtype": "float32"}, {"name": "48", "dtype": "float32"}, {"name": "49", "dtype": "float32"}, {"name": "50", "dtype": "float32"}, {"name": "51", "dtype": "float32"}, {"name": "52", "dtype": "float32"}, {"name": "53", "dtype": "float32"}, {"name": "54", "dtype": "float32"}, {"name": "55", "dtype": "float32"}, {"name": "56", "dtype": "float32"}, {"name": "57", "dtype": "float32"}, {"name": "58", "dtype": "float32"}, {"name": "59", "dtype": "float32"}, {"name": "60", "dtype": "float32"}, {"name": "61", "dtype": "float32"}, {"name": "62", "dtype": "float32"}, {"name": "63", "dtype": "float32"}, {"name": "64", "dtype": "float32"}, {"name": "65", "dtype": "float32"}, {"name": "66", "dtype": "float32"}, {"name": "67", "dtype": "float32"}, {"name": "68", "dtype": "float32"}, {"name": "69", "dtype": "float32"}, {"name": "70", "dtype": "float32"}, {"name": "71", "dtype": "float32"}, {"name": "72", "dtype": "float32"}, {"name": "73", "dtype": "float32"}, {"name": "74", "dtype": "float32"}, {"name": "75", "dtype": "float32"}, {"name": "76", "dtype": "float32"}, {"name": "77", "dtype": "float32"}, {"name": "78", "dtype": "float32"}, {"name": "79", "dtype": "float32"}, {"name": "80", "dtype": "float32"}, {"name": "81", "dtype": "float32"}, {"name": "82", "dtype": "float32"}, {"name": "83", "dtype": "float32"}, {"name": "84", "dtype": "float32"}, {"name": "85", "dtype": "float32"}, {"name": "86", "dtype": "float32"}, {"name": "87", "dtype": "float32"}, {"name": "88", "dtype": "float32"}, {"name": "89", "dtype": "float32"}, {"name": "90", "dtype": "float32"}, {"name": "91", "dtype": "float32"}, {"name": "92", "dtype": "float32"}, {"name": "93", "dtype": "float32"}, 
{"name": "94", "dtype": "float32"}, {"name": "95", "dtype": "float32"}, {"name": "96", "dtype": "float32"}, {"name": "97", "dtype": "float32"}, {"name": "98", "dtype": "float32"}, {"name": "99", "dtype": "float32"}, {"name": "100", "dtype": "float32"}, {"name": "101", "dtype": "float32"}, {"name": "102", "dtype": "float32"}, {"name": "103", "dtype": "float32"}, {"name": "104", "dtype": "float32"}, {"name": "105", "dtype": "float32"}, {"name": "106", "dtype": "float32"}, {"name": "107", "dtype": "float32"}, {"name": "108", "dtype": "float32"}, {"name": "109", "dtype": "float32"}, {"name": "110", "dtype": "float32"}, {"name": "111", "dtype": "float32"}, {"name": "112", "dtype": "float32"}, {"name": "113", "dtype": "float32"}, {"name": "114", "dtype": "float32"}, {"name": "115", "dtype": "float32"}, {"name": "116", "dtype": "float32"}, {"name": "117", "dtype": "float32"}, {"name": "118", "dtype": "float32"}, {"name": "119", "dtype": "float32"}, {"name": "120", "dtype": "float32"}, {"name": "121", "dtype": "float32"}, {"name": "122", "dtype": "float32"}, {"name": "123", "dtype": "float32"}, {"name": "124", "dtype": "float32"}, {"name": "125", "dtype": "float32"}, {"name": "126", "dtype": "float32"}, {"name": "127", "dtype": "float32"}, {"name": "128", "dtype": "float32"}, {"name": "129", "dtype": "float32"}, {"name": "130", "dtype": "float32"}, {"name": "131", "dtype": "float32"}, {"name": "132", "dtype": "float32"}, {"name": "133", "dtype": "float32"}, {"name": "134", "dtype": "float32"}, {"name": "135", "dtype": "float32"}, {"name": "136", "dtype": "float32"}, {"name": "137", "dtype": "float32"}, {"name": "138", "dtype": "float32"}, {"name": "139", "dtype": "float32"}, {"name": "140", "dtype": "float32"}, {"name": "141", "dtype": "float32"}, {"name": "142", "dtype": "float32"}, {"name": "143", "dtype": "float32"}, {"name": "144", "dtype": "float32"}, {"name": "145", "dtype": "float32"}, {"name": "146", "dtype": "float32"}, {"name": "147", "dtype": "float32"}, {"name": "148", "dtype": "float32"}, {"name": "149", "dtype": "float32"}, {"name": "150", "dtype": "float32"}, {"name": "151", "dtype": "float32"}, {"name": "152", "dtype": "float32"}, {"name": "153", "dtype": "float32"}, {"name": "154", "dtype": "float32"}, {"name": "155", "dtype": "float32"}, {"name": "156", "dtype": "float32"}, {"name": "157", "dtype": "float32"}, {"name": "158", "dtype": "float32"}, {"name": "159", "dtype": "float32"}, {"name": "160", "dtype": "float32"}, {"name": "161", "dtype": "float32"}, {"name": "162", "dtype": "float32"}, {"name": "163", "dtype": "float32"}, {"name": "164", "dtype": "float32"}, {"name": "165", "dtype": "float32"}, {"name": "166", "dtype": "float32"}, {"name": "167", "dtype": "float32"}, {"name": "168", "dtype": "float32"}, {"name": "169", "dtype": "float32"}, {"name": "170", "dtype": "float32"}, {"name": "171", "dtype": "float32"}, {"name": "172", "dtype": "float32"}, {"name": "173", "dtype": "float32"}, {"name": "174", "dtype": "float32"}, {"name": "175", "dtype": "float32"}, {"name": "176", "dtype": "float32"}, {"name": "177", "dtype": "float32"}, {"name": "178", "dtype": "float32"}, {"name": "179", "dtype": "float32"}, {"name": "180", "dtype": "float32"}, {"name": "181", "dtype": "float32"}, {"name": "182", "dtype": "float32"}, {"name": "183", "dtype": "float32"}, {"name": "184", "dtype": "float32"}, {"name": "185", "dtype": "float32"}, {"name": "186", "dtype": "float32"}, {"name": "187", "dtype": "float32"}, {"name": "188", "dtype": "float32"}, {"name": "189", "dtype": "float32"}, {"name": 
"190", "dtype": "float32"}, {"name": "191", "dtype": "float32"}, {"name": "192", "dtype": "float32"}, {"name": "193", "dtype": "float32"}, {"name": "194", "dtype": "float32"}, {"name": "195", "dtype": "float32"}, {"name": "196", "dtype": "float32"}, {"name": "197", "dtype": "float32"}, {"name": "198", "dtype": "float32"}, {"name": "199", "dtype": "float32"}, {"name": "200", "dtype": "float32"}, {"name": "201", "dtype": "float32"}, {"name": "202", "dtype": "float32"}, {"name": "203", "dtype": "float32"}, {"name": "204", "dtype": "float32"}, {"name": "205", "dtype": "float32"}, {"name": "206", "dtype": "float32"}, {"name": "207", "dtype": "float32"}, {"name": "208", "dtype": "float32"}, {"name": "209", "dtype": "float32"}, {"name": "210", "dtype": "float32"}, {"name": "211", "dtype": "float32"}, {"name": "212", "dtype": "float32"}, {"name": "213", "dtype": "float32"}, {"name": "214", "dtype": "float32"}, {"name": "215", "dtype": "float32"}, {"name": "216", "dtype": "float32"}, {"name": "217", "dtype": "float32"}, {"name": "218", "dtype": "float32"}, {"name": "219", "dtype": "float32"}, {"name": "220", "dtype": "float32"}, {"name": "221", "dtype": "float32"}, {"name": "222", "dtype": "float32"}, {"name": "223", "dtype": "float32"}, {"name": "224", "dtype": "float32"}, {"name": "225", "dtype": "float32"}, {"name": "226", "dtype": "float32"}, {"name": "227", "dtype": "float32"}, {"name": "228", "dtype": "float32"}, {"name": "229", "dtype": "float32"}, {"name": "230", "dtype": "float32"}, {"name": "231", "dtype": "float32"}, {"name": "232", "dtype": "float32"}, {"name": "233", "dtype": "float32"}, {"name": "234", "dtype": "float32"}, {"name": "235", "dtype": "float32"}, {"name": "236", "dtype": "float32"}, {"name": "237", "dtype": "float32"}, {"name": "238", "dtype": "float32"}, {"name": "239", "dtype": "float32"}, {"name": "240", "dtype": "float32"}, {"name": "241", "dtype": "float32"}, {"name": "242", "dtype": "float32"}, {"name": "243", "dtype": "float32"}, {"name": "244", "dtype": "float32"}, {"name": "245", "dtype": "float32"}, {"name": "246", "dtype": "float32"}, {"name": "247", "dtype": "float32"}, {"name": "248", "dtype": "float32"}, {"name": "249", "dtype": "float32"}, {"name": "250", "dtype": "float32"}, {"name": "251", "dtype": "float32"}, {"name": "252", "dtype": "float32"}, {"name": "253", "dtype": "float32"}, {"name": "254", "dtype": "float32"}, {"name": "255", "dtype": "float32"}, {"name": "256", "dtype": "float32"}, {"name": "257", "dtype": "float32"}, {"name": "258", "dtype": "float32"}, {"name": "259", "dtype": "float32"}, {"name": "260", "dtype": "float32"}, {"name": "261", "dtype": "float32"}, {"name": "262", "dtype": "float32"}, {"name": "263", "dtype": "float32"}, {"name": "264", "dtype": "float32"}, {"name": "265", "dtype": "float32"}, {"name": "266", "dtype": "float32"}, {"name": "267", "dtype": "float32"}, {"name": "268", "dtype": "float32"}, {"name": "269", "dtype": "float32"}, {"name": "270", "dtype": "float32"}, {"name": "271", "dtype": "float32"}, {"name": "272", "dtype": "float32"}, {"name": "273", "dtype": "float32"}, {"name": "274", "dtype": "float32"}, {"name": "275", "dtype": "float32"}, {"name": "276", "dtype": "float32"}, {"name": "277", "dtype": "float32"}, {"name": "278", "dtype": "float32"}, {"name": "279", "dtype": "float32"}, {"name": "280", "dtype": "float32"}, {"name": "281", "dtype": "float32"}, {"name": "282", "dtype": "float32"}, {"name": "283", "dtype": "float32"}, {"name": "284", "dtype": "float32"}, {"name": "285", "dtype": "float32"}, {"name": 
"286", "dtype": "float32"}, {"name": "287", "dtype": "float32"}, {"name": "288", "dtype": "float32"}, {"name": "289", "dtype": "float32"}, {"name": "290", "dtype": "float32"}, {"name": "291", "dtype": "float32"}, {"name": "292", "dtype": "float32"}, {"name": "293", "dtype": "float32"}, {"name": "294", "dtype": "float32"}, {"name": "295", "dtype": "float32"}, {"name": "296", "dtype": "float32"}, {"name": "297", "dtype": "float32"}, {"name": "298", "dtype": "float32"}, {"name": "299", "dtype": "float32"}, {"name": "300", "dtype": "float32"}, {"name": "301", "dtype": "float32"}, {"name": "302", "dtype": "float32"}, {"name": "303", "dtype": "float32"}, {"name": "304", "dtype": "float32"}, {"name": "305", "dtype": "float32"}, {"name": "306", "dtype": "float32"}, {"name": "307", "dtype": "float32"}, {"name": "308", "dtype": "float32"}, {"name": "309", "dtype": "float32"}, {"name": "310", "dtype": "float32"}, {"name": "311", "dtype": "float32"}, {"name": "312", "dtype": "float32"}, {"name": "313", "dtype": "float32"}, {"name": "314", "dtype": "float32"}, {"name": "315", "dtype": "float32"}, {"name": "316", "dtype": "float32"}, {"name": "317", "dtype": "float32"}, {"name": "318", "dtype": "float32"}, {"name": "319", "dtype": "float32"}, {"name": "320", "dtype": "float32"}, {"name": "321", "dtype": "float32"}, {"name": "322", "dtype": "float32"}, {"name": "323", "dtype": "float32"}, {"name": "324", "dtype": "float32"}, {"name": "325", "dtype": "float32"}, {"name": "326", "dtype": "float32"}, {"name": "327", "dtype": "float32"}, {"name": "328", "dtype": "float32"}, {"name": "329", "dtype": "float32"}, {"name": "330", "dtype": "float32"}, {"name": "331", "dtype": "float32"}, {"name": "332", "dtype": "float32"}, {"name": "333", "dtype": "float32"}, {"name": "334", "dtype": "float32"}, {"name": "335", "dtype": "float32"}, {"name": "336", "dtype": "float32"}, {"name": "337", "dtype": "float32"}, {"name": "338", "dtype": "float32"}, {"name": "339", "dtype": "float32"}, {"name": "340", "dtype": "float32"}, {"name": "341", "dtype": "float32"}, {"name": "342", "dtype": "float32"}, {"name": "343", "dtype": "float32"}, {"name": "344", "dtype": "float32"}, {"name": "345", "dtype": "float32"}, {"name": "346", "dtype": "float32"}, {"name": "347", "dtype": "float32"}, {"name": "348", "dtype": "float32"}, {"name": "349", "dtype": "float32"}, {"name": "350", "dtype": "float32"}, {"name": "351", "dtype": "float32"}, {"name": "352", "dtype": "float32"}, {"name": "353", "dtype": "float32"}, {"name": "354", "dtype": "float32"}, {"name": "355", "dtype": "float32"}, {"name": "356", "dtype": "float32"}, {"name": "357", "dtype": "float32"}, {"name": "358", "dtype": "float32"}, {"name": "359", "dtype": "float32"}, {"name": "360", "dtype": "float32"}, {"name": "361", "dtype": "float32"}, {"name": "362", "dtype": "float32"}, {"name": "363", "dtype": "float32"}, {"name": "364", "dtype": "float32"}, {"name": "365", "dtype": "float32"}, {"name": "366", "dtype": "float32"}, {"name": "367", "dtype": "float32"}, {"name": "368", "dtype": "float32"}, {"name": "369", "dtype": "float32"}, {"name": "370", "dtype": "float32"}, {"name": "371", "dtype": "float32"}, {"name": "372", "dtype": "float32"}, {"name": "373", "dtype": "float32"}, {"name": "374", "dtype": "float32"}, {"name": "375", "dtype": "float32"}, {"name": "376", "dtype": "float32"}, {"name": "377", "dtype": "float32"}, {"name": "378", "dtype": "float32"}, {"name": "379", "dtype": "float32"}, {"name": "380", "dtype": "float32"}, {"name": "381", "dtype": "float32"}, {"name": 
"382", "dtype": "float32"}, {"name": "383", "dtype": "float32"}, {"name": "384", "dtype": "float32"}, {"name": "385", "dtype": "float32"}, {"name": "386", "dtype": "float32"}, {"name": "387", "dtype": "float32"}, {"name": "388", "dtype": "float32"}, {"name": "389", "dtype": "float32"}, {"name": "390", "dtype": "float32"}, {"name": "391", "dtype": "float32"}, {"name": "392", "dtype": "float32"}, {"name": "393", "dtype": "float32"}, {"name": "394", "dtype": "float32"}, {"name": "395", "dtype": "float32"}, {"name": "396", "dtype": "float32"}, {"name": "397", "dtype": "float32"}, {"name": "398", "dtype": "float32"}, {"name": "399", "dtype": "float32"}, {"name": "400", "dtype": "float32"}, {"name": "401", "dtype": "float32"}, {"name": "402", "dtype": "float32"}, {"name": "403", "dtype": "float32"}, {"name": "404", "dtype": "float32"}, {"name": "405", "dtype": "float32"}, {"name": "406", "dtype": "float32"}, {"name": "407", "dtype": "float32"}, {"name": "408", "dtype": "float32"}, {"name": "409", "dtype": "float32"}, {"name": "410", "dtype": "float32"}, {"name": "411", "dtype": "float32"}, {"name": "412", "dtype": "float32"}, {"name": "413", "dtype": "float32"}, {"name": "414", "dtype": "float32"}, {"name": "415", "dtype": "float32"}, {"name": "416", "dtype": "float32"}, {"name": "417", "dtype": "float32"}, {"name": "418", "dtype": "float32"}, {"name": "419", "dtype": "float32"}, {"name": "420", "dtype": "float32"}, {"name": "421", "dtype": "float32"}, {"name": "422", "dtype": "float32"}, {"name": "423", "dtype": "float32"}, {"name": "424", "dtype": "float32"}, {"name": "425", "dtype": "float32"}, {"name": "426", "dtype": "float32"}, {"name": "427", "dtype": "float32"}, {"name": "428", "dtype": "float32"}, {"name": "429", "dtype": "float32"}, {"name": "430", "dtype": "float32"}, {"name": "431", "dtype": "float32"}, {"name": "432", "dtype": "float32"}, {"name": "433", "dtype": "float32"}, {"name": "434", "dtype": "float32"}, {"name": "435", "dtype": "float32"}, {"name": "436", "dtype": "float32"}, {"name": "437", "dtype": "float32"}, {"name": "438", "dtype": "float32"}, {"name": "439", "dtype": "float32"}, {"name": "440", "dtype": "float32"}, {"name": "441", "dtype": "float32"}, {"name": "442", "dtype": "float32"}, {"name": "443", "dtype": "float32"}, {"name": "444", "dtype": "float32"}, {"name": "445", "dtype": "float32"}, {"name": "446", "dtype": "float32"}, {"name": "447", "dtype": "float32"}, {"name": "448", "dtype": "float32"}, {"name": "449", "dtype": "float32"}, {"name": "450", "dtype": "float32"}, {"name": "451", "dtype": "float32"}, {"name": "452", "dtype": "float32"}, {"name": "453", "dtype": "float32"}, {"name": "454", "dtype": "float32"}, {"name": "455", "dtype": "float32"}, {"name": "456", "dtype": "float32"}, {"name": "457", "dtype": "float32"}, {"name": "458", "dtype": "float32"}, {"name": "459", "dtype": "float32"}, {"name": "460", "dtype": "float32"}, {"name": "461", "dtype": "float32"}, {"name": "462", "dtype": "float32"}, {"name": "463", "dtype": "float32"}, {"name": "464", "dtype": "float32"}, {"name": "465", "dtype": "float32"}, {"name": "466", "dtype": "float32"}, {"name": "467", "dtype": "float32"}, {"name": "468", "dtype": "float32"}, {"name": "469", "dtype": "float32"}, {"name": "470", "dtype": "float32"}, {"name": "471", "dtype": "float32"}, {"name": "472", "dtype": "float32"}, {"name": "473", "dtype": "float32"}, {"name": "474", "dtype": "float32"}, {"name": "475", "dtype": "float32"}, {"name": "476", "dtype": "float32"}, {"name": "477", "dtype": "float32"}, {"name": 
"478", "dtype": "float32"}, {"name": "479", "dtype": "float32"}, {"name": "480", "dtype": "float32"}, {"name": "481", "dtype": "float32"}, {"name": "482", "dtype": "float32"}, {"name": "483", "dtype": "float32"}, {"name": "484", "dtype": "float32"}, {"name": "485", "dtype": "float32"}, {"name": "486", "dtype": "float32"}, {"name": "487", "dtype": "float32"}, {"name": "488", "dtype": "float32"}, {"name": "489", "dtype": "float32"}, {"name": "490", "dtype": "float32"}, {"name": "491", "dtype": "float32"}, {"name": "492", "dtype": "float32"}, {"name": "493", "dtype": "float32"}, {"name": "494", "dtype": "float32"}, {"name": "495", "dtype": "float32"}, {"name": "496", "dtype": "float32"}, {"name": "497", "dtype": "float32"}, {"name": "498", "dtype": "float32"}, {"name": "499", "dtype": "float32"}, {"name": "500", "dtype": "float32"}, {"name": "501", "dtype": "float32"}, {"name": "502", "dtype": "float32"}, {"name": "503", "dtype": "float32"}, {"name": "504", "dtype": "float32"}, {"name": "505", "dtype": "float32"}, {"name": "506", "dtype": "float32"}, {"name": "507", "dtype": "float32"}, {"name": "508", "dtype": "float32"}, {"name": "509", "dtype": "float32"}, {"name": "510", "dtype": "float32"}, {"name": "511", "dtype": "float32"}, {"name": "512", "dtype": "float32"}, {"name": "513", "dtype": "float32"}, {"name": "514", "dtype": "float32"}, {"name": "515", "dtype": "float32"}, {"name": "516", "dtype": "float32"}, {"name": "517", "dtype": "float32"}, {"name": "518", "dtype": "float32"}, {"name": "519", "dtype": "float32"}, {"name": "520", "dtype": "float32"}, {"name": "521", "dtype": "float32"}, {"name": "522", "dtype": "float32"}, {"name": "523", "dtype": "float32"}, {"name": "524", "dtype": "float32"}, {"name": "525", "dtype": "float32"}, {"name": "526", "dtype": "float32"}, {"name": "527", "dtype": "float32"}, {"name": "528", "dtype": "float32"}, {"name": "529", "dtype": "float32"}, {"name": "530", "dtype": "float32"}, {"name": "531", "dtype": "float32"}, {"name": "532", "dtype": "float32"}, {"name": "533", "dtype": "float32"}, {"name": "534", "dtype": "float32"}, {"name": "535", "dtype": "float32"}, {"name": "536", "dtype": "float32"}, {"name": "537", "dtype": "float32"}, {"name": "538", "dtype": "float32"}, {"name": "539", "dtype": "float32"}, {"name": "540", "dtype": "float32"}, {"name": "541", "dtype": "float32"}, {"name": "542", "dtype": "float32"}, {"name": "543", "dtype": "float32"}, {"name": "544", "dtype": "float32"}, {"name": "545", "dtype": "float32"}, {"name": "546", "dtype": "float32"}, {"name": "547", "dtype": "float32"}, {"name": "548", "dtype": "float32"}, {"name": "549", "dtype": "float32"}, {"name": "550", "dtype": "float32"}, {"name": "551", "dtype": "float32"}, {"name": "552", "dtype": "float32"}, {"name": "553", "dtype": "float32"}, {"name": "554", "dtype": "float32"}, {"name": "555", "dtype": "float32"}, {"name": "556", "dtype": "float32"}, {"name": "557", "dtype": "float32"}, {"name": "558", "dtype": "float32"}, {"name": "559", "dtype": "float32"}, {"name": "560", "dtype": "float32"}, {"name": "561", "dtype": "float32"}, {"name": "562", "dtype": "float32"}, {"name": "563", "dtype": "float32"}, {"name": "564", "dtype": "float32"}, {"name": "565", "dtype": "float32"}, {"name": "566", "dtype": "float32"}, {"name": "567", "dtype": "float32"}, {"name": "568", "dtype": "float32"}, {"name": "569", "dtype": "float32"}, {"name": "570", "dtype": "float32"}, {"name": "571", "dtype": "float32"}, {"name": "572", "dtype": "float32"}, {"name": "573", "dtype": "float32"}, {"name": 
"574", "dtype": "float32"}, {"name": "575", "dtype": "float32"}, {"name": "576", "dtype": "float32"}, {"name": "577", "dtype": "float32"}, {"name": "578", "dtype": "float32"}, {"name": "579", "dtype": "float32"}, {"name": "580", "dtype": "float32"}, {"name": "581", "dtype": "float32"}, {"name": "582", "dtype": "float32"}, {"name": "583", "dtype": "float32"}, {"name": "584", "dtype": "float32"}, {"name": "585", "dtype": "float32"}, {"name": "586", "dtype": "float32"}, {"name": "587", "dtype": "float32"}, {"name": "588", "dtype": "float32"}, {"name": "589", "dtype": "float32"}, {"name": "590", "dtype": "float32"}, {"name": "591", "dtype": "float32"}, {"name": "592", "dtype": "float32"}, {"name": "593", "dtype": "float32"}, {"name": "594", "dtype": "float32"}, {"name": "595", "dtype": "float32"}, {"name": "596", "dtype": "float32"}, {"name": "597", "dtype": "float32"}, {"name": "598", "dtype": "float32"}, {"name": "599", "dtype": "float32"}, {"name": "600", "dtype": "float32"}, {"name": "601", "dtype": "float32"}, {"name": "602", "dtype": "float32"}, {"name": "603", "dtype": "float32"}, {"name": "604", "dtype": "float32"}, {"name": "605", "dtype": "float32"}, {"name": "606", "dtype": "float32"}, {"name": "607", "dtype": "float32"}, {"name": "608", "dtype": "float32"}, {"name": "609", "dtype": "float32"}, {"name": "610", "dtype": "float32"}, {"name": "611", "dtype": "float32"}, {"name": "612", "dtype": "float32"}, {"name": "613", "dtype": "float32"}, {"name": "614", "dtype": "float32"}, {"name": "615", "dtype": "float32"}, {"name": "616", "dtype": "float32"}, {"name": "617", "dtype": "float32"}, {"name": "618", "dtype": "float32"}, {"name": "619", "dtype": "float32"}, {"name": "620", "dtype": "float32"}, {"name": "621", "dtype": "float32"}, {"name": "622", "dtype": "float32"}, {"name": "623", "dtype": "float32"}, {"name": "624", "dtype": "float32"}, {"name": "625", "dtype": "float32"}, {"name": "626", "dtype": "float32"}, {"name": "627", "dtype": "float32"}, {"name": "628", "dtype": "float32"}, {"name": "629", "dtype": "float32"}, {"name": "630", "dtype": "float32"}, {"name": "631", "dtype": "float32"}, {"name": "632", "dtype": "float32"}, {"name": "633", "dtype": "float32"}, {"name": "634", "dtype": "float32"}, {"name": "635", "dtype": "float32"}, {"name": "636", "dtype": "float32"}, {"name": "637", "dtype": "float32"}, {"name": "638", "dtype": "float32"}, {"name": "639", "dtype": "float32"}, {"name": "640", "dtype": "float32"}, {"name": "641", "dtype": "float32"}, {"name": "642", "dtype": "float32"}, {"name": "643", "dtype": "float32"}, {"name": "644", "dtype": "float32"}, {"name": "645", "dtype": "float32"}, {"name": "646", "dtype": "float32"}, {"name": "647", "dtype": "float32"}, {"name": "648", "dtype": "float32"}, {"name": "649", "dtype": "float32"}, {"name": "650", "dtype": "float32"}, {"name": "651", "dtype": "float32"}, {"name": "652", "dtype": "float32"}, {"name": "653", "dtype": "float32"}, {"name": "654", "dtype": "float32"}, {"name": "655", "dtype": "float32"}, {"name": "656", "dtype": "float32"}, {"name": "657", "dtype": "float32"}, {"name": "658", "dtype": "float32"}, {"name": "659", "dtype": "float32"}, {"name": "660", "dtype": "float32"}, {"name": "661", "dtype": "float32"}, {"name": "662", "dtype": "float32"}, {"name": "663", "dtype": "float32"}, {"name": "664", "dtype": "float32"}, {"name": "665", "dtype": "float32"}, {"name": "666", "dtype": "float32"}, {"name": "667", "dtype": "float32"}, {"name": "668", "dtype": "float32"}, {"name": "669", "dtype": "float32"}, {"name": 
"670", "dtype": "float32"}, {"name": "671", "dtype": "float32"}, {"name": "672", "dtype": "float32"}, {"name": "673", "dtype": "float32"}, {"name": "674", "dtype": "float32"}, {"name": "675", "dtype": "float32"}, {"name": "676", "dtype": "float32"}, {"name": "677", "dtype": "float32"}, {"name": "678", "dtype": "float32"}, {"name": "679", "dtype": "float32"}, {"name": "680", "dtype": "float32"}, {"name": "681", "dtype": "float32"}, {"name": "682", "dtype": "float32"}, {"name": "683", "dtype": "float32"}, {"name": "684", "dtype": "float32"}, {"name": "685", "dtype": "float32"}, {"name": "686", "dtype": "float32"}, {"name": "687", "dtype": "float32"}, {"name": "688", "dtype": "float32"}, {"name": "689", "dtype": "float32"}, {"name": "690", "dtype": "float32"}, {"name": "691", "dtype": "float32"}, {"name": "692", "dtype": "float32"}, {"name": "693", "dtype": "float32"}, {"name": "694", "dtype": "float32"}, {"name": "695", "dtype": "float32"}, {"name": "696", "dtype": "float32"}, {"name": "697", "dtype": "float32"}, {"name": "698", "dtype": "float32"}, {"name": "699", "dtype": "float32"}, {"name": "700", "dtype": "float32"}, {"name": "701", "dtype": "float32"}, {"name": "702", "dtype": "float32"}, {"name": "703", "dtype": "float32"}, {"name": "704", "dtype": "float32"}, {"name": "705", "dtype": "float32"}, {"name": "706", "dtype": "float32"}, {"name": "707", "dtype": "float32"}, {"name": "708", "dtype": "float32"}, {"name": "709", "dtype": "float32"}, {"name": "710", "dtype": "float32"}, {"name": "711", "dtype": "float32"}, {"name": "712", "dtype": "float32"}, {"name": "713", "dtype": "float32"}, {"name": "714", "dtype": "float32"}, {"name": "715", "dtype": "float32"}, {"name": "716", "dtype": "float32"}, {"name": "717", "dtype": "float32"}, {"name": "718", "dtype": "float32"}, {"name": "719", "dtype": "float32"}, {"name": "720", "dtype": "float32"}, {"name": "721", "dtype": "float32"}, {"name": "722", "dtype": "float32"}, {"name": "723", "dtype": "float32"}, {"name": "724", "dtype": "float32"}, {"name": "725", "dtype": "float32"}, {"name": "726", "dtype": "float32"}, {"name": "727", "dtype": "float32"}, {"name": "728", "dtype": "float32"}, {"name": "729", "dtype": "float32"}, {"name": "730", "dtype": "float32"}, {"name": "731", "dtype": "float32"}, {"name": "732", "dtype": "float32"}, {"name": "733", "dtype": "float32"}, {"name": "734", "dtype": "float32"}, {"name": "735", "dtype": "float32"}, {"name": "736", "dtype": "float32"}, {"name": "737", "dtype": "float32"}, {"name": "738", "dtype": "float32"}, {"name": "739", "dtype": "float32"}, {"name": "740", "dtype": "float32"}, {"name": "741", "dtype": "float32"}, {"name": "742", "dtype": "float32"}, {"name": "743", "dtype": "float32"}, {"name": "744", "dtype": "float32"}, {"name": "745", "dtype": "float32"}, {"name": "746", "dtype": "float32"}, {"name": "747", "dtype": "float32"}, {"name": "748", "dtype": "float32"}, {"name": "749", "dtype": "float32"}, {"name": "750", "dtype": "float32"}, {"name": "751", "dtype": "float32"}, {"name": "752", "dtype": "float32"}, {"name": "753", "dtype": "float32"}, {"name": "754", "dtype": "float32"}, {"name": "755", "dtype": "float32"}, {"name": "756", "dtype": "float32"}, {"name": "757", "dtype": "float32"}, {"name": "758", "dtype": "float32"}, {"name": "759", "dtype": "float32"}, {"name": "760", "dtype": "float32"}, {"name": "761", "dtype": "float32"}, {"name": "762", "dtype": "float32"}, {"name": "763", "dtype": "float32"}, {"name": "764", "dtype": "float32"}, {"name": "765", "dtype": "float32"}, {"name": 
"766", "dtype": "float32"}, {"name": "767", "dtype": "float32"}, {"name": "768", "dtype": "float32"}, {"name": "769", "dtype": "float32"}, {"name": "770", "dtype": "float32"}, {"name": "771", "dtype": "float32"}, {"name": "772", "dtype": "float32"}, {"name": "773", "dtype": "float32"}, {"name": "774", "dtype": "float32"}, {"name": "775", "dtype": "float32"}, {"name": "776", "dtype": "float32"}, {"name": "777", "dtype": "float32"}, {"name": "778", "dtype": "float32"}, {"name": "779", "dtype": "float32"}, {"name": "780", "dtype": "float32"}, {"name": "781", "dtype": "float32"}, {"name": "782", "dtype": "float32"}, {"name": "783", "dtype": "float32"}, {"name": "784", "dtype": "float32"}, {"name": "785", "dtype": "float32"}, {"name": "786", "dtype": "float32"}, {"name": "787", "dtype": "float32"}, {"name": "788", "dtype": "float32"}, {"name": "789", "dtype": "float32"}, {"name": "790", "dtype": "float32"}, {"name": "791", "dtype": "float32"}, {"name": "792", "dtype": "float32"}, {"name": "793", "dtype": "float32"}, {"name": "794", "dtype": "float32"}, {"name": "795", "dtype": "float32"}, {"name": "796", "dtype": "float32"}, {"name": "797", "dtype": "float32"}, {"name": "798", "dtype": "float32"}, {"name": "799", "dtype": "float32"}, {"name": "800", "dtype": "float32"}, {"name": "801", "dtype": "float32"}, {"name": "802", "dtype": "float32"}, {"name": "803", "dtype": "float32"}, {"name": "804", "dtype": "float32"}, {"name": "805", "dtype": "float32"}, {"name": "806", "dtype": "float32"}, {"name": "807", "dtype": "float32"}, {"name": "808", "dtype": "float32"}, {"name": "809", "dtype": "float32"}, {"name": "810", "dtype": "float32"}, {"name": "811", "dtype": "float32"}, {"name": "812", "dtype": "float32"}, {"name": "813", "dtype": "float32"}, {"name": "814", "dtype": "float32"}, {"name": "815", "dtype": "float32"}, {"name": "816", "dtype": "float32"}, {"name": "817", "dtype": "float32"}, {"name": "818", "dtype": "float32"}, {"name": "819", "dtype": "float32"}, {"name": "820", "dtype": "float32"}, {"name": "821", "dtype": "float32"}, {"name": "822", "dtype": "float32"}, {"name": "823", "dtype": "float32"}, {"name": "824", "dtype": "float32"}, {"name": "825", "dtype": "float32"}, {"name": "826", "dtype": "float32"}, {"name": "827", "dtype": "float32"}, {"name": "828", "dtype": "float32"}, {"name": "829", "dtype": "float32"}, {"name": "830", "dtype": "float32"}, {"name": "831", "dtype": "float32"}, {"name": "832", "dtype": "float32"}, {"name": "833", "dtype": "float32"}, {"name": "834", "dtype": "float32"}, {"name": "835", "dtype": "float32"}, {"name": "836", "dtype": "float32"}, {"name": "837", "dtype": "float32"}, {"name": "838", "dtype": "float32"}, {"name": "839", "dtype": "float32"}, {"name": "840", "dtype": "float32"}, {"name": "841", "dtype": "float32"}, {"name": "842", "dtype": "float32"}, {"name": "843", "dtype": "float32"}, {"name": "844", "dtype": "float32"}, {"name": "845", "dtype": "float32"}, {"name": "846", "dtype": "float32"}, {"name": "847", "dtype": "float32"}, {"name": "848", "dtype": "float32"}, {"name": "849", "dtype": "float32"}, {"name": "850", "dtype": "float32"}, {"name": "851", "dtype": "float32"}, {"name": "852", "dtype": "float32"}, {"name": "853", "dtype": "float32"}, {"name": "854", "dtype": "float32"}, {"name": "855", "dtype": "float32"}, {"name": "856", "dtype": "float32"}, {"name": "857", "dtype": "float32"}, {"name": "858", "dtype": "float32"}, {"name": "859", "dtype": "float32"}, {"name": "860", "dtype": "float32"}, {"name": "861", "dtype": "float32"}, {"name": 
"862", "dtype": "float32"}, {"name": "863", "dtype": "float32"}, {"name": "864", "dtype": "float32"}, {"name": "865", "dtype": "float32"}, {"name": "866", "dtype": "float32"}, {"name": "867", "dtype": "float32"}, {"name": "868", "dtype": "float32"}, {"name": "869", "dtype": "float32"}, {"name": "870", "dtype": "float32"}, {"name": "871", "dtype": "float32"}, {"name": "872", "dtype": "float32"}, {"name": "873", "dtype": "float32"}, {"name": "874", "dtype": "float32"}, {"name": "875", "dtype": "float32"}, {"name": "876", "dtype": "float32"}, {"name": "877", "dtype": "float32"}, {"name": "878", "dtype": "float32"}, {"name": "879", "dtype": "float32"}, {"name": "880", "dtype": "float32"}, {"name": "881", "dtype": "float32"}, {"name": "882", "dtype": "float32"}, {"name": "883", "dtype": "float32"}, {"name": "884", "dtype": "float32"}, {"name": "885", "dtype": "float32"}, {"name": "886", "dtype": "float32"}, {"name": "887", "dtype": "float32"}, {"name": "888", "dtype": "float32"}, {"name": "889", "dtype": "float32"}, {"name": "890", "dtype": "float32"}, {"name": "891", "dtype": "float32"}, {"name": "892", "dtype": "float32"}, {"name": "893", "dtype": "float32"}, {"name": "894", "dtype": "float32"}, {"name": "895", "dtype": "float32"}, {"name": "896", "dtype": "float32"}, {"name": "897", "dtype": "float32"}, {"name": "898", "dtype": "float32"}, {"name": "899", "dtype": "float32"}, {"name": "900", "dtype": "float32"}, {"name": "901", "dtype": "float32"}, {"name": "902", "dtype": "float32"}, {"name": "903", "dtype": "float32"}, {"name": "904", "dtype": "float32"}, {"name": "905", "dtype": "float32"}, {"name": "906", "dtype": "float32"}, {"name": "907", "dtype": "float32"}, {"name": "908", "dtype": "float32"}, {"name": "909", "dtype": "float32"}, {"name": "910", "dtype": "float32"}, {"name": "911", "dtype": "float32"}, {"name": "912", "dtype": "float32"}, {"name": "913", "dtype": "float32"}, {"name": "914", "dtype": "float32"}, {"name": "915", "dtype": "float32"}, {"name": "916", "dtype": "float32"}, {"name": "917", "dtype": "float32"}, {"name": "918", "dtype": "float32"}, {"name": "919", "dtype": "float32"}, {"name": "920", "dtype": "float32"}, {"name": "921", "dtype": "float32"}, {"name": "922", "dtype": "float32"}, {"name": "923", "dtype": "float32"}, {"name": "924", "dtype": "float32"}, {"name": "925", "dtype": "float32"}, {"name": "926", "dtype": "float32"}, {"name": "927", "dtype": "float32"}, {"name": "928", "dtype": "float32"}, {"name": "929", "dtype": "float32"}, {"name": "930", "dtype": "float32"}, {"name": "931", "dtype": "float32"}, {"name": "932", "dtype": "float32"}, {"name": "933", "dtype": "float32"}, {"name": "934", "dtype": "float32"}, {"name": "935", "dtype": "float32"}, {"name": "936", "dtype": "float32"}, {"name": "937", "dtype": "float32"}, {"name": "938", "dtype": "float32"}, {"name": "939", "dtype": "float32"}, {"name": "940", "dtype": "float32"}, {"name": "941", "dtype": "float32"}, {"name": "942", "dtype": "float32"}, {"name": "943", "dtype": "float32"}, {"name": "944", "dtype": "float32"}, {"name": "945", "dtype": "float32"}, {"name": "946", "dtype": "float32"}, {"name": "947", "dtype": "float32"}, {"name": "948", "dtype": "float32"}, {"name": "949", "dtype": "float32"}, {"name": "950", "dtype": "float32"}, {"name": "951", "dtype": "float32"}, {"name": "952", "dtype": "float32"}, {"name": "953", "dtype": "float32"}, {"name": "954", "dtype": "float32"}, {"name": "955", "dtype": "float32"}, {"name": "956", "dtype": "float32"}, {"name": "957", "dtype": "float32"}, {"name": 
"958", "dtype": "float32"}, {"name": "959", "dtype": "float32"}, {"name": "960", "dtype": "float32"}, {"name": "961", "dtype": "float32"}, {"name": "962", "dtype": "float32"}, {"name": "963", "dtype": "float32"}, {"name": "964", "dtype": "float32"}, {"name": "965", "dtype": "float32"}, {"name": "966", "dtype": "float32"}, {"name": "967", "dtype": "float32"}, {"name": "968", "dtype": "float32"}, {"name": "969", "dtype": "float32"}, {"name": "970", "dtype": "float32"}, {"name": "971", "dtype": "float32"}, {"name": "972", "dtype": "float32"}, {"name": "973", "dtype": "float32"}, {"name": "974", "dtype": "float32"}, {"name": "975", "dtype": "float32"}, {"name": "976", "dtype": "float32"}, {"name": "977", "dtype": "float32"}, {"name": "978", "dtype": "float32"}, {"name": "979", "dtype": "float32"}, {"name": "980", "dtype": "float32"}, {"name": "981", "dtype": "float32"}, {"name": "982", "dtype": "float32"}, {"name": "983", "dtype": "float32"}, {"name": "984", "dtype": "float32"}, {"name": "985", "dtype": "float32"}, {"name": "986", "dtype": "float32"}, {"name": "987", "dtype": "float32"}, {"name": "988", "dtype": "float32"}, {"name": "989", "dtype": "float32"}, {"name": "990", "dtype": "float32"}, {"name": "991", "dtype": "float32"}, {"name": "992", "dtype": "float32"}, {"name": "993", "dtype": "float32"}, {"name": "994", "dtype": "float32"}, {"name": "995", "dtype": "float32"}, {"name": "996", "dtype": "float32"}, {"name": "997", "dtype": "float32"}, {"name": "998", "dtype": "float32"}, {"name": "999", "dtype": "float32"}, {"name": "1000", "dtype": "float32"}, {"name": "1001", "dtype": "float32"}, {"name": "1002", "dtype": "float32"}, {"name": "1003", "dtype": "float32"}, {"name": "1004", "dtype": "float32"}, {"name": "1005", "dtype": "float32"}, {"name": "1006", "dtype": "float32"}, {"name": "1007", "dtype": "float32"}, {"name": "1008", "dtype": "float32"}, {"name": "1009", "dtype": "float32"}, {"name": "1010", "dtype": "float32"}, {"name": "1011", "dtype": "float32"}, {"name": "1012", "dtype": "float32"}, {"name": "1013", "dtype": "float32"}, {"name": "1014", "dtype": "float32"}, {"name": "1015", "dtype": "float32"}, {"name": "1016", "dtype": "float32"}, {"name": "1017", "dtype": "float32"}, {"name": "1018", "dtype": "float32"}, {"name": "1019", "dtype": "float32"}, {"name": "1020", "dtype": "float32"}, {"name": "1021", "dtype": "float32"}, {"name": "1022", "dtype": "float32"}, {"name": "1023", "dtype": "float32"}, {"name": "1024", "dtype": "float32"}, {"name": "1025", "dtype": "float32"}, {"name": "1026", "dtype": "float32"}, {"name": "1027", "dtype": "float32"}, {"name": "1028", "dtype": "float32"}, {"name": "1029", "dtype": "float32"}, {"name": "1030", "dtype": "float32"}, {"name": "1031", "dtype": "float32"}, {"name": "1032", "dtype": "float32"}, {"name": "1033", "dtype": "float32"}, {"name": "1034", "dtype": "float32"}, {"name": "1035", "dtype": "float32"}, {"name": "1036", "dtype": "float32"}, {"name": "1037", "dtype": "float32"}, {"name": "1038", "dtype": "float32"}, {"name": "1039", "dtype": "float32"}, {"name": "1040", "dtype": "float32"}, {"name": "1041", "dtype": "float32"}, {"name": "1042", "dtype": "float32"}, {"name": "1043", "dtype": "float32"}, {"name": "1044", "dtype": "float32"}, {"name": "1045", "dtype": "float32"}, {"name": "1046", "dtype": "float32"}, {"name": "1047", "dtype": "float32"}, {"name": "1048", "dtype": "float32"}, {"name": "1049", "dtype": "float32"}, {"name": "1050", "dtype": "float32"}, {"name": "1051", "dtype": "float32"}, {"name": "1052", "dtype": 
"float32"}, {"name": "1053", "dtype": "float32"}, {"name": "1054", "dtype": "float32"}, {"name": "1055", "dtype": "float32"}, {"name": "1056", "dtype": "float32"}, {"name": "1057", "dtype": "float32"}, {"name": "1058", "dtype": "float32"}, {"name": "1059", "dtype": "float32"}, {"name": "1060", "dtype": "float32"}, {"name": "1061", "dtype": "float32"}, {"name": "1062", "dtype": "float32"}, {"name": "1063", "dtype": "float32"}, {"name": "1064", "dtype": "float32"}, {"name": "1065", "dtype": "float32"}, {"name": "1066", "dtype": "float32"}, {"name": "1067", "dtype": "float32"}, {"name": "1068", "dtype": "float32"}, {"name": "1069", "dtype": "float32"}, {"name": "1070", "dtype": "float32"}, {"name": "1071", "dtype": "float32"}, {"name": "1072", "dtype": "float32"}, {"name": "1073", "dtype": "float32"}, {"name": "1074", "dtype": "float32"}, {"name": "1075", "dtype": "float32"}, {"name": "1076", "dtype": "float32"}, {"name": "1077", "dtype": "float32"}, {"name": "1078", "dtype": "float32"}, {"name": "1079", "dtype": "float32"}, {"name": "1080", "dtype": "float32"}, {"name": "1081", "dtype": "float32"}, {"name": "1082", "dtype": "float32"}, {"name": "1083", "dtype": "float32"}, {"name": "1084", "dtype": "float32"}, {"name": "1085", "dtype": "float32"}, {"name": "1086", "dtype": "float32"}, {"name": "1087", "dtype": "float32"}, {"name": "1088", "dtype": "float32"}, {"name": "1089", "dtype": "float32"}, {"name": "1090", "dtype": "float32"}, {"name": "1091", "dtype": "float32"}, {"name": "1092", "dtype": "float32"}, {"name": "1093", "dtype": "float32"}, {"name": "1094", "dtype": "float32"}, {"name": "1095", "dtype": "float32"}, {"name": "1096", "dtype": "float32"}, {"name": "1097", "dtype": "float32"}, {"name": "1098", "dtype": "float32"}, {"name": "1099", "dtype": "float32"}, {"name": "1100", "dtype": "float32"}, {"name": "1101", "dtype": "float32"}, {"name": "1102", "dtype": "float32"}, {"name": "1103", "dtype": "float32"}, {"name": "1104", "dtype": "float32"}, {"name": "1105", "dtype": "float32"}, {"name": "1106", "dtype": "float32"}, {"name": "1107", "dtype": "float32"}, {"name": "1108", "dtype": "float32"}, {"name": "1109", "dtype": "float32"}, {"name": "1110", "dtype": "float32"}, {"name": "1111", "dtype": "float32"}, {"name": "1112", "dtype": "float32"}, {"name": "1113", "dtype": "float32"}, {"name": "1114", "dtype": "float32"}, {"name": "1115", "dtype": "float32"}, {"name": "1116", "dtype": "float32"}, {"name": "1117", "dtype": "float32"}, {"name": "1118", "dtype": "float32"}, {"name": "1119", "dtype": "float32"}, {"name": "1120", "dtype": "float32"}, {"name": "1121", "dtype": "float32"}, {"name": "1122", "dtype": "float32"}, {"name": "1123", "dtype": "float32"}, {"name": "1124", "dtype": "float32"}, {"name": "1125", "dtype": "float32"}, {"name": "1126", "dtype": "float32"}, {"name": "1127", "dtype": "float32"}, {"name": "1128", "dtype": "float32"}, {"name": "1129", "dtype": "float32"}, {"name": "1130", "dtype": "float32"}, {"name": "1131", "dtype": "float32"}, {"name": "1132", "dtype": "float32"}, {"name": "1133", "dtype": "float32"}, {"name": "1134", "dtype": "float32"}, {"name": "1135", "dtype": "float32"}, {"name": "1136", "dtype": "float32"}, {"name": "1137", "dtype": "float32"}, {"name": "1138", "dtype": "float32"}, {"name": "1139", "dtype": "float32"}, {"name": "1140", "dtype": "float32"}, {"name": "1141", "dtype": "float32"}, {"name": "1142", "dtype": "float32"}, {"name": "1143", "dtype": "float32"}, {"name": "1144", "dtype": "float32"}, {"name": "1145", "dtype": "float32"}, {"name": 
"1146", "dtype": "float32"}, {"name": "1147", "dtype": "float32"}, {"name": "1148", "dtype": "float32"}, {"name": "1149", "dtype": "float32"}, {"name": "1150", "dtype": "float32"}, {"name": "1151", "dtype": "float32"}, {"name": "1152", "dtype": "float32"}, {"name": "1153", "dtype": "float32"}, {"name": "1154", "dtype": "float32"}, {"name": "1155", "dtype": "float32"}, {"name": "1156", "dtype": "float32"}, {"name": "1157", "dtype": "float32"}, {"name": "1158", "dtype": "float32"}, {"name": "1159", "dtype": "float32"}, {"name": "1160", "dtype": "float32"}, {"name": "1161", "dtype": "float32"}, {"name": "1162", "dtype": "float32"}, {"name": "1163", "dtype": "float32"}, {"name": "1164", "dtype": "float32"}, {"name": "1165", "dtype": "float32"}, {"name": "1166", "dtype": "float32"}, {"name": "1167", "dtype": "float32"}, {"name": "1168", "dtype": "float32"}, {"name": "1169", "dtype": "float32"}, {"name": "1170", "dtype": "float32"}, {"name": "1171", "dtype": "float32"}, {"name": "1172", "dtype": "float32"}, {"name": "1173", "dtype": "float32"}, {"name": "1174", "dtype": "float32"}, {"name": "1175", "dtype": "float32"}, {"name": "1176", "dtype": "float32"}, {"name": "1177", "dtype": "float32"}, {"name": "1178", "dtype": "float32"}, {"name": "1179", "dtype": "float32"}, {"name": "1180", "dtype": "float32"}, {"name": "1181", "dtype": "float32"}, {"name": "1182", "dtype": "float32"}, {"name": "1183", "dtype": "float32"}, {"name": "1184", "dtype": "float32"}, {"name": "1185", "dtype": "float32"}, {"name": "1186", "dtype": "float32"}, {"name": "1187", "dtype": "float32"}, {"name": "1188", "dtype": "float32"}, {"name": "1189", "dtype": "float32"}, {"name": "1190", "dtype": "float32"}, {"name": "1191", "dtype": "float32"}, {"name": "1192", "dtype": "float32"}, {"name": "1193", "dtype": "float32"}, {"name": "1194", "dtype": "float32"}, {"name": "1195", "dtype": "float32"}, {"name": "1196", "dtype": "float32"}, {"name": "1197", "dtype": "float32"}, {"name": "1198", "dtype": "float32"}, {"name": "1199", "dtype": "float32"}, {"name": "1200", "dtype": "float32"}, {"name": "1201", "dtype": "float32"}, {"name": "1202", "dtype": "float32"}, {"name": "1203", "dtype": "float32"}, {"name": "1204", "dtype": "float32"}, {"name": "1205", "dtype": "float32"}, {"name": "1206", "dtype": "float32"}, {"name": "1207", "dtype": "float32"}, {"name": "1208", "dtype": "float32"}, {"name": "1209", "dtype": "float32"}, {"name": "1210", "dtype": "float32"}, {"name": "1211", "dtype": "float32"}, {"name": "1212", "dtype": "float32"}, {"name": "1213", "dtype": "float32"}, {"name": "1214", "dtype": "float32"}, {"name": "1215", "dtype": "float32"}, {"name": "1216", "dtype": "float32"}, {"name": "1217", "dtype": "float32"}, {"name": "1218", "dtype": "float32"}, {"name": "1219", "dtype": "float32"}, {"name": "1220", "dtype": "float32"}, {"name": "1221", "dtype": "float32"}, {"name": "1222", "dtype": "float32"}, {"name": "1223", "dtype": "float32"}, {"name": "1224", "dtype": "float32"}, {"name": "1225", "dtype": "float32"}, {"name": "1226", "dtype": "float32"}, {"name": "1227", "dtype": "float32"}, {"name": "1228", "dtype": "float32"}, {"name": "1229", "dtype": "float32"}, {"name": "1230", "dtype": "float32"}, {"name": "1231", "dtype": "float32"}, {"name": "1232", "dtype": "float32"}, {"name": "1233", "dtype": "float32"}, {"name": "1234", "dtype": "float32"}, {"name": "1235", "dtype": "float32"}, {"name": "1236", "dtype": "float32"}, {"name": "1237", "dtype": "float32"}, {"name": "1238", "dtype": "float32"}, {"name": "1239", "dtype": 
"float32"}, {"name": "1240", "dtype": "float32"}, {"name": "1241", "dtype": "float32"}, {"name": "1242", "dtype": "float32"}, {"name": "1243", "dtype": "float32"}, {"name": "1244", "dtype": "float32"}, {"name": "1245", "dtype": "float32"}, {"name": "1246", "dtype": "float32"}, {"name": "1247", "dtype": "float32"}, {"name": "1248", "dtype": "float32"}, {"name": "1249", "dtype": "float32"}, {"name": "1250", "dtype": "float32"}, {"name": "1251", "dtype": "float32"}, {"name": "1252", "dtype": "float32"}, {"name": "1253", "dtype": "float32"}, {"name": "1254", "dtype": "float32"}, {"name": "1255", "dtype": "float32"}, {"name": "1256", "dtype": "float32"}, {"name": "1257", "dtype": "float32"}, {"name": "1258", "dtype": "float32"}, {"name": "1259", "dtype": "float32"}, {"name": "1260", "dtype": "float32"}, {"name": "1261", "dtype": "float32"}, {"name": "1262", "dtype": "float32"}, {"name": "1263", "dtype": "float32"}, {"name": "1264", "dtype": "float32"}, {"name": "1265", "dtype": "float32"}, {"name": "1266", "dtype": "float32"}, {"name": "1267", "dtype": "float32"}, {"name": "1268", "dtype": "float32"}, {"name": "1269", "dtype": "float32"}, {"name": "1270", "dtype": "float32"}, {"name": "1271", "dtype": "float32"}, {"name": "1272", "dtype": "float32"}, {"name": "1273", "dtype": "float32"}, {"name": "1274", "dtype": "float32"}, {"name": "1275", "dtype": "float32"}, {"name": "1276", "dtype": "float32"}, {"name": "1277", "dtype": "float32"}, {"name": "1278", "dtype": "float32"}, {"name": "1279", "dtype": "float32"}, {"name": "1280", "dtype": "float32"}, {"name": "1281", "dtype": "float32"}, {"name": "1282", "dtype": "float32"}, {"name": "1283", "dtype": "float32"}, {"name": "1284", "dtype": "float32"}, {"name": "1285", "dtype": "float32"}, {"name": "1286", "dtype": "float32"}, {"name": "1287", "dtype": "float32"}, {"name": "1288", "dtype": "float32"}, {"name": "1289", "dtype": "float32"}, {"name": "1290", "dtype": "float32"}, {"name": "1291", "dtype": "float32"}, {"name": "1292", "dtype": "float32"}, {"name": "1293", "dtype": "float32"}, {"name": "1294", "dtype": "float32"}, {"name": "1295", "dtype": "float32"}, {"name": "1296", "dtype": "float32"}, {"name": "1297", "dtype": "float32"}, {"name": "1298", "dtype": "float32"}, {"name": "1299", "dtype": "float32"}, {"name": "1300", "dtype": "float32"}, {"name": "1301", "dtype": "float32"}, {"name": "1302", "dtype": "float32"}, {"name": "1303", "dtype": "float32"}, {"name": "1304", "dtype": "float32"}, {"name": "1305", "dtype": "float32"}, {"name": "1306", "dtype": "float32"}, {"name": "1307", "dtype": "float32"}, {"name": "1308", "dtype": "float32"}, {"name": "1309", "dtype": "float32"}, {"name": "1310", "dtype": "float32"}, {"name": "1311", "dtype": "float32"}, {"name": "1312", "dtype": "float32"}, {"name": "1313", "dtype": "float32"}, {"name": "1314", "dtype": "float32"}, {"name": "1315", "dtype": "float32"}, {"name": "1316", "dtype": "float32"}, {"name": "1317", "dtype": "float32"}, {"name": "1318", "dtype": "float32"}, {"name": "1319", "dtype": "float32"}, {"name": "1320", "dtype": "float32"}, {"name": "1321", "dtype": "float32"}, {"name": "1322", "dtype": "float32"}, {"name": "1323", "dtype": "float32"}, {"name": "1324", "dtype": "float32"}, {"name": "1325", "dtype": "float32"}, {"name": "1326", "dtype": "float32"}, {"name": "1327", "dtype": "float32"}, {"name": "1328", "dtype": "float32"}, {"name": "1329", "dtype": "float32"}, {"name": "1330", "dtype": "float32"}, {"name": "1331", "dtype": "float32"}, {"name": "1332", "dtype": "float32"}, {"name": 
"1333", "dtype": "float32"}, {"name": "1334", "dtype": "float32"}, {"name": "1335", "dtype": "float32"}, {"name": "1336", "dtype": "float32"}, {"name": "1337", "dtype": "float32"}, {"name": "1338", "dtype": "float32"}, {"name": "1339", "dtype": "float32"}, {"name": "1340", "dtype": "float32"}, {"name": "1341", "dtype": "float32"}, {"name": "1342", "dtype": "float32"}, {"name": "1343", "dtype": "float32"}, {"name": "1344", "dtype": "float32"}, {"name": "1345", "dtype": "float32"}, {"name": "1346", "dtype": "float32"}, {"name": "1347", "dtype": "float32"}, {"name": "1348", "dtype": "float32"}, {"name": "1349", "dtype": "float32"}, {"name": "1350", "dtype": "float32"}, {"name": "1351", "dtype": "float32"}, {"name": "1352", "dtype": "float32"}, {"name": "1353", "dtype": "float32"}, {"name": "1354", "dtype": "float32"}, {"name": "1355", "dtype": "float32"}, {"name": "1356", "dtype": "float32"}, {"name": "1357", "dtype": "float32"}, {"name": "1358", "dtype": "float32"}, {"name": "1359", "dtype": "float32"}, {"name": "1360", "dtype": "float32"}, {"name": "1361", "dtype": "float32"}, {"name": "1362", "dtype": "float32"}, {"name": "1363", "dtype": "float32"}, {"name": "1364", "dtype": "float32"}, {"name": "1365", "dtype": "float32"}, {"name": "1366", "dtype": "float32"}, {"name": "1367", "dtype": "float32"}, {"name": "1368", "dtype": "float32"}, {"name": "1369", "dtype": "float32"}, {"name": "1370", "dtype": "float32"}, {"name": "1371", "dtype": "float32"}, {"name": "1372", "dtype": "float32"}, {"name": "1373", "dtype": "float32"}, {"name": "1374", "dtype": "float32"}, {"name": "1375", "dtype": "float32"}, {"name": "1376", "dtype": "float32"}, {"name": "1377", "dtype": "float32"}, {"name": "1378", "dtype": "float32"}, {"name": "1379", "dtype": "float32"}, {"name": "1380", "dtype": "float32"}, {"name": "1381", "dtype": "float32"}, {"name": "1382", "dtype": "float32"}, {"name": "1383", "dtype": "float32"}, {"name": "1384", "dtype": "float32"}, {"name": "1385", "dtype": "float32"}, {"name": "1386", "dtype": "float32"}, {"name": "1387", "dtype": "float32"}, {"name": "1388", "dtype": "float32"}, {"name": "1389", "dtype": "float32"}, {"name": "1390", "dtype": "float32"}, {"name": "1391", "dtype": "float32"}, {"name": "1392", "dtype": "float32"}, {"name": "1393", "dtype": "float32"}, {"name": "1394", "dtype": "float32"}, {"name": "1395", "dtype": "float32"}, {"name": "1396", "dtype": "float32"}, {"name": "1397", "dtype": "float32"}, {"name": "1398", "dtype": "float32"}, {"name": "1399", "dtype": "float32"}, {"name": "1400", "dtype": "float32"}, {"name": "1401", "dtype": "float32"}, {"name": "1402", "dtype": "float32"}, {"name": "1403", "dtype": "float32"}, {"name": "1404", "dtype": "float32"}, {"name": "1405", "dtype": "float32"}, {"name": "1406", "dtype": "float32"}, {"name": "1407", "dtype": "float32"}, {"name": "1408", "dtype": "float32"}, {"name": "1409", "dtype": "float32"}, {"name": "1410", "dtype": "float32"}, {"name": "1411", "dtype": "float32"}, {"name": "1412", "dtype": "float32"}, {"name": "1413", "dtype": "float32"}, {"name": "1414", "dtype": "float32"}, {"name": "1415", "dtype": "float32"}, {"name": "1416", "dtype": "float32"}, {"name": "1417", "dtype": "float32"}, {"name": "1418", "dtype": "float32"}, {"name": "1419", "dtype": "float32"}, {"name": "1420", "dtype": "float32"}, {"name": "1421", "dtype": "float32"}, {"name": "1422", "dtype": "float32"}, {"name": "1423", "dtype": "float32"}, {"name": "1424", "dtype": "float32"}, {"name": "1425", "dtype": "float32"}, {"name": "1426", "dtype": 
"float32"}, {"name": "1427", "dtype": "float32"}, {"name": "1428", "dtype": "float32"}, {"name": "1429", "dtype": "float32"}, {"name": "1430", "dtype": "float32"}, {"name": "1431", "dtype": "float32"}, {"name": "1432", "dtype": "float32"}, {"name": "1433", "dtype": "float32"}, {"name": "1434", "dtype": "float32"}, {"name": "1435", "dtype": "float32"}, {"name": "1436", "dtype": "float32"}, {"name": "1437", "dtype": "float32"}, {"name": "1438", "dtype": "float32"}, {"name": "1439", "dtype": "float32"}, {"name": "1440", "dtype": "float32"}, {"name": "1441", "dtype": "float32"}, {"name": "1442", "dtype": "float32"}, {"name": "1443", "dtype": "float32"}, {"name": "1444", "dtype": "float32"}, {"name": "1445", "dtype": "float32"}, {"name": "1446", "dtype": "float32"}, {"name": "1447", "dtype": "float32"}, {"name": "1448", "dtype": "float32"}, {"name": "1449", "dtype": "float32"}, {"name": "1450", "dtype": "float32"}, {"name": "1451", "dtype": "float32"}, {"name": "1452", "dtype": "float32"}, {"name": "1453", "dtype": "float32"}, {"name": "1454", "dtype": "float32"}, {"name": "1455", "dtype": "float32"}, {"name": "1456", "dtype": "float32"}, {"name": "1457", "dtype": "float32"}, {"name": "1458", "dtype": "float32"}, {"name": "1459", "dtype": "float32"}, {"name": "1460", "dtype": "float32"}, {"name": "1461", "dtype": "float32"}, {"name": "1462", "dtype": "float32"}, {"name": "1463", "dtype": "float32"}, {"name": "1464", "dtype": "float32"}, {"name": "1465", "dtype": "float32"}, {"name": "1466", "dtype": "float32"}, {"name": "1467", "dtype": "float32"}, {"name": "1468", "dtype": "float32"}, {"name": "1469", "dtype": "float32"}, {"name": "1470", "dtype": "float32"}, {"name": "1471", "dtype": "float32"}, {"name": "1472", "dtype": "float32"}, {"name": "1473", "dtype": "float32"}, {"name": "1474", "dtype": "float32"}, {"name": "1475", "dtype": "float32"}, {"name": "1476", "dtype": "float32"}, {"name": "1477", "dtype": "float32"}, {"name": "1478", "dtype": "float32"}, {"name": "1479", "dtype": "float32"}, {"name": "1480", "dtype": "float32"}, {"name": "1481", "dtype": "float32"}, {"name": "1482", "dtype": "float32"}, {"name": "1483", "dtype": "float32"}, {"name": "1484", "dtype": "float32"}, {"name": "1485", "dtype": "float32"}, {"name": "1486", "dtype": "float32"}, {"name": "1487", "dtype": "float32"}, {"name": "1488", "dtype": "float32"}, {"name": "1489", "dtype": "float32"}, {"name": "1490", "dtype": "float32"}, {"name": "1491", "dtype": "float32"}, {"name": "1492", "dtype": "float32"}, {"name": "1493", "dtype": "float32"}, {"name": "1494", "dtype": "float32"}, {"name": "1495", "dtype": "float32"}, {"name": "1496", "dtype": "float32"}, {"name": "1497", "dtype": "float32"}, {"name": "1498", "dtype": "float32"}, {"name": "1499", "dtype": "float32"}, {"name": "1500", "dtype": "float32"}, {"name": "1501", "dtype": "float32"}, {"name": "1502", "dtype": "float32"}, {"name": "1503", "dtype": "float32"}, {"name": "1504", "dtype": "float32"}, {"name": "1505", "dtype": "float32"}, {"name": "1506", "dtype": "float32"}, {"name": "1507", "dtype": "float32"}, {"name": "1508", "dtype": "float32"}, {"name": "1509", "dtype": "float32"}, {"name": "1510", "dtype": "float32"}, {"name": "1511", "dtype": "float32"}, {"name": "1512", "dtype": "float32"}, {"name": "1513", "dtype": "float32"}, {"name": "1514", "dtype": "float32"}, {"name": "1515", "dtype": "float32"}, {"name": "1516", "dtype": "float32"}, {"name": "1517", "dtype": "float32"}, {"name": "1518", "dtype": "float32"}, {"name": "1519", "dtype": "float32"}, {"name": 
"1520", "dtype": "float32"}, {"name": "1521", "dtype": "float32"}, {"name": "1522", "dtype": "float32"}, {"name": "1523", "dtype": "float32"}, {"name": "1524", "dtype": "float32"}, {"name": "1525", "dtype": "float32"}, {"name": "1526", "dtype": "float32"}, {"name": "1527", "dtype": "float32"}, {"name": "1528", "dtype": "float32"}, {"name": "1529", "dtype": "float32"}, {"name": "1530", "dtype": "float32"}, {"name": "1531", "dtype": "float32"}, {"name": "1532", "dtype": "float32"}, {"name": "1533", "dtype": "float32"}, {"name": "1534", "dtype": "float32"}, {"name": "1535", "dtype": "float32"}, {"name": "1536", "dtype": "float32"}, {"name": "1537", "dtype": "float32"}, {"name": "1538", "dtype": "float32"}, {"name": "1539", "dtype": "float32"}, {"name": "1540", "dtype": "float32"}, {"name": "1541", "dtype": "float32"}, {"name": "1542", "dtype": "float32"}, {"name": "1543", "dtype": "float32"}, {"name": "1544", "dtype": "float32"}, {"name": "1545", "dtype": "float32"}, {"name": "1546", "dtype": "float32"}, {"name": "1547", "dtype": "float32"}, {"name": "1548", "dtype": "float32"}, {"name": "1549", "dtype": "float32"}, {"name": "1550", "dtype": "float32"}, {"name": "1551", "dtype": "float32"}, {"name": "1552", "dtype": "float32"}, {"name": "1553", "dtype": "float32"}, {"name": "1554", "dtype": "float32"}, {"name": "1555", "dtype": "float32"}, {"name": "1556", "dtype": "float32"}, {"name": "1557", "dtype": "float32"}, {"name": "1558", "dtype": "float32"}, {"name": "1559", "dtype": "float32"}, {"name": "1560", "dtype": "float32"}, {"name": "1561", "dtype": "float32"}, {"name": "1562", "dtype": "float32"}, {"name": "1563", "dtype": "float32"}, {"name": "1564", "dtype": "float32"}, {"name": "1565", "dtype": "float32"}, {"name": "1566", "dtype": "float32"}, {"name": "1567", "dtype": "float32"}, {"name": "1568", "dtype": "float32"}, {"name": "1569", "dtype": "float32"}, {"name": "1570", "dtype": "float32"}, {"name": "1571", "dtype": "float32"}, {"name": "1572", "dtype": "float32"}, {"name": "1573", "dtype": "float32"}, {"name": "1574", "dtype": "float32"}, {"name": "1575", "dtype": "float32"}, {"name": "1576", "dtype": "float32"}, {"name": "1577", "dtype": "float32"}, {"name": "1578", "dtype": "float32"}, {"name": "1579", "dtype": "float32"}, {"name": "1580", "dtype": "float32"}, {"name": "1581", "dtype": "float32"}, {"name": "1582", "dtype": "float32"}, {"name": "1583", "dtype": "float32"}, {"name": "1584", "dtype": "float32"}, {"name": "1585", "dtype": "float32"}, {"name": "1586", "dtype": "float32"}, {"name": "1587", "dtype": "float32"}, {"name": "1588", "dtype": "float32"}, {"name": "1589", "dtype": "float32"}, {"name": "1590", "dtype": "float32"}, {"name": "1591", "dtype": "float32"}, {"name": "1592", "dtype": "float32"}, {"name": "1593", "dtype": "float32"}, {"name": "1594", "dtype": "float32"}, {"name": "1595", "dtype": "float32"}, {"name": "1596", "dtype": "float32"}, {"name": "1597", "dtype": "float32"}, {"name": "1598", "dtype": "float32"}, {"name": "1599", "dtype": "float32"}, {"name": "1600", "dtype": "float32"}, {"name": "1601", "dtype": "float32"}, {"name": "1602", "dtype": "float32"}, {"name": "1603", "dtype": "float32"}, {"name": "1604", "dtype": "float32"}, {"name": "1605", "dtype": "float32"}, {"name": "1606", "dtype": "float32"}, {"name": "1607", "dtype": "float32"}, {"name": "1608", "dtype": "float32"}, {"name": "1609", "dtype": "float32"}, {"name": "1610", "dtype": "float32"}, {"name": "1611", "dtype": "float32"}, {"name": "1612", "dtype": "float32"}, {"name": "1613", "dtype": 
"float32"}, {"name": "1614", "dtype": "float32"}, {"name": "1615", "dtype": "float32"}, {"name": "1616", "dtype": "float32"}, {"name": "1617", "dtype": "float32"}, {"name": "1618", "dtype": "float32"}, {"name": "1619", "dtype": "float32"}, {"name": "1620", "dtype": "float32"}, {"name": "1621", "dtype": "float32"}, {"name": "1622", "dtype": "float32"}, {"name": "1623", "dtype": "float32"}, {"name": "1624", "dtype": "float32"}, {"name": "1625", "dtype": "float32"}, {"name": "1626", "dtype": "float32"}, {"name": "1627", "dtype": "float32"}, {"name": "1628", "dtype": "float32"}, {"name": "1629", "dtype": "float32"}, {"name": "1630", "dtype": "float32"}, {"name": "1631", "dtype": "float32"}, {"name": "1632", "dtype": "float32"}, {"name": "1633", "dtype": "float32"}, {"name": "1634", "dtype": "float32"}, {"name": "1635", "dtype": "float32"}, {"name": "1636", "dtype": "float32"}, {"name": "1637", "dtype": "float32"}, {"name": "1638", "dtype": "float32"}, {"name": "1639", "dtype": "float32"}, {"name": "1640", "dtype": "float32"}, {"name": "1641", "dtype": "float32"}, {"name": "1642", "dtype": "float32"}, {"name": "1643", "dtype": "float32"}, {"name": "1644", "dtype": "float32"}, {"name": "1645", "dtype": "float32"}, {"name": "1646", "dtype": "float32"}, {"name": "1647", "dtype": "float32"}, {"name": "1648", "dtype": "float32"}, {"name": "1649", "dtype": "float32"}, {"name": "1650", "dtype": "float32"}, {"name": "1651", "dtype": "float32"}, {"name": "1652", "dtype": "float32"}, {"name": "1653", "dtype": "float32"}, {"name": "1654", "dtype": "float32"}, {"name": "1655", "dtype": "float32"}, {"name": "1656", "dtype": "float32"}, {"name": "1657", "dtype": "float32"}, {"name": "1658", "dtype": "float32"}, {"name": "1659", "dtype": "float32"}, {"name": "1660", "dtype": "float32"}, {"name": "1661", "dtype": "float32"}, {"name": "1662", "dtype": "float32"}, {"name": "1663", "dtype": "float32"}, {"name": "1664", "dtype": "float32"}, {"name": "1665", "dtype": "float32"}, {"name": "1666", "dtype": "float32"}, {"name": "1667", "dtype": "float32"}, {"name": "1668", "dtype": "float32"}, {"name": "1669", "dtype": "float32"}, {"name": "1670", "dtype": "float32"}, {"name": "1671", "dtype": "float32"}, {"name": "1672", "dtype": "float32"}, {"name": "1673", "dtype": "float32"}, {"name": "1674", "dtype": "float32"}, {"name": "1675", "dtype": "float32"}, {"name": "1676", "dtype": "float32"}, {"name": "1677", "dtype": "float32"}, {"name": "1678", "dtype": "float32"}, {"name": "1679", "dtype": "float32"}, {"name": "1680", "dtype": "float32"}, {"name": "1681", "dtype": "float32"}, {"name": "1682", "dtype": "float32"}, {"name": "1683", "dtype": "float32"}, {"name": "1684", "dtype": "float32"}, {"name": "1685", "dtype": "float32"}, {"name": "1686", "dtype": "float32"}, {"name": "1687", "dtype": "float32"}, {"name": "1688", "dtype": "float32"}, {"name": "1689", "dtype": "float32"}, {"name": "1690", "dtype": "float32"}, {"name": "1691", "dtype": "float32"}, {"name": "1692", "dtype": "float32"}, {"name": "1693", "dtype": "float32"}, {"name": "1694", "dtype": "float32"}, {"name": "1695", "dtype": "float32"}, {"name": "1696", "dtype": "float32"}, {"name": "1697", "dtype": "float32"}, {"name": "1698", "dtype": "float32"}, {"name": "1699", "dtype": "float32"}, {"name": "1700", "dtype": "float32"}, {"name": "1701", "dtype": "float32"}, {"name": "1702", "dtype": "float32"}, {"name": "1703", "dtype": "float32"}, {"name": "1704", "dtype": "float32"}, {"name": "1705", "dtype": "float32"}, {"name": "1706", "dtype": "float32"}, {"name": 
"1707", "dtype": "float32"}, {"name": "1708", "dtype": "float32"}, {"name": "1709", "dtype": "float32"}, {"name": "1710", "dtype": "float32"}, {"name": "1711", "dtype": "float32"}, {"name": "1712", "dtype": "float32"}, {"name": "1713", "dtype": "float32"}, {"name": "1714", "dtype": "float32"}, {"name": "1715", "dtype": "float32"}, {"name": "1716", "dtype": "float32"}, {"name": "1717", "dtype": "float32"}, {"name": "1718", "dtype": "float32"}, {"name": "1719", "dtype": "float32"}, {"name": "1720", "dtype": "float32"}, {"name": "1721", "dtype": "float32"}, {"name": "1722", "dtype": "float32"}, {"name": "1723", "dtype": "float32"}, {"name": "1724", "dtype": "float32"}, {"name": "1725", "dtype": "float32"}, {"name": "1726", "dtype": "float32"}, {"name": "1727", "dtype": "float32"}, {"name": "1728", "dtype": "float32"}, {"name": "1729", "dtype": "float32"}, {"name": "1730", "dtype": "float32"}, {"name": "1731", "dtype": "float32"}, {"name": "1732", "dtype": "float32"}, {"name": "1733", "dtype": "float32"}, {"name": "1734", "dtype": "float32"}, {"name": "1735", "dtype": "float32"}, {"name": "1736", "dtype": "float32"}, {"name": "1737", "dtype": "float32"}, {"name": "1738", "dtype": "float32"}, {"name": "1739", "dtype": "float32"}, {"name": "1740", "dtype": "float32"}, {"name": "1741", "dtype": "float32"}, {"name": "1742", "dtype": "float32"}, {"name": "1743", "dtype": "float32"}, {"name": "1744", "dtype": "float32"}, {"name": "1745", "dtype": "float32"}, {"name": "1746", "dtype": "float32"}, {"name": "1747", "dtype": "float32"}, {"name": "1748", "dtype": "float32"}, {"name": "1749", "dtype": "float32"}, {"name": "1750", "dtype": "float32"}, {"name": "1751", "dtype": "float32"}, {"name": "1752", "dtype": "float32"}, {"name": "1753", "dtype": "float32"}, {"name": "1754", "dtype": "float32"}, {"name": "1755", "dtype": "float32"}, {"name": "1756", "dtype": "float32"}, {"name": "1757", "dtype": "float32"}, {"name": "1758", "dtype": "float32"}, {"name": "1759", "dtype": "float32"}, {"name": "1760", "dtype": "float32"}, {"name": "1761", "dtype": "float32"}, {"name": "1762", "dtype": "float32"}, {"name": "1763", "dtype": "float32"}, {"name": "1764", "dtype": "float32"}, {"name": "1765", "dtype": "float32"}, {"name": "1766", "dtype": "float32"}, {"name": "1767", "dtype": "float32"}, {"name": "1768", "dtype": "float32"}, {"name": "1769", "dtype": "float32"}, {"name": "1770", "dtype": "float32"}, {"name": "1771", "dtype": "float32"}, {"name": "1772", "dtype": "float32"}, {"name": "1773", "dtype": "float32"}, {"name": "1774", "dtype": "float32"}, {"name": "1775", "dtype": "float32"}, {"name": "1776", "dtype": "float32"}, {"name": "1777", "dtype": "float32"}, {"name": "1778", "dtype": "float32"}, {"name": "1779", "dtype": "float32"}, {"name": "1780", "dtype": "float32"}, {"name": "1781", "dtype": "float32"}, {"name": "1782", "dtype": "float32"}, {"name": "1783", "dtype": "float32"}, {"name": "1784", "dtype": "float32"}, {"name": "1785", "dtype": "float32"}, {"name": "1786", "dtype": "float32"}, {"name": "1787", "dtype": "float32"}, {"name": "1788", "dtype": "float32"}, {"name": "1789", "dtype": "float32"}, {"name": "1790", "dtype": "float32"}, {"name": "1791", "dtype": "float32"}, {"name": "1792", "dtype": "float32"}, {"name": "1793", "dtype": "float32"}, {"name": "1794", "dtype": "float32"}, {"name": "1795", "dtype": "float32"}, {"name": "1796", "dtype": "float32"}, {"name": "1797", "dtype": "float32"}, {"name": "1798", "dtype": "float32"}, {"name": "1799", "dtype": "float32"}, {"name": "1800", "dtype": 
"float32"}, {"name": "1801", "dtype": "float32"}, {"name": "1802", "dtype": "float32"}, {"name": "1803", "dtype": "float32"}, {"name": "1804", "dtype": "float32"}, {"name": "1805", "dtype": "float32"}, {"name": "1806", "dtype": "float32"}, {"name": "1807", "dtype": "float32"}, {"name": "1808", "dtype": "float32"}, {"name": "1809", "dtype": "float32"}, {"name": "1810", "dtype": "float32"}, {"name": "1811", "dtype": "float32"}, {"name": "1812", "dtype": "float32"}, {"name": "1813", "dtype": "float32"}, {"name": "1814", "dtype": "float32"}, {"name": "1815", "dtype": "float32"}, {"name": "1816", "dtype": "float32"}, {"name": "1817", "dtype": "float32"}, {"name": "1818", "dtype": "float32"}, {"name": "1819", "dtype": "float32"}, {"name": "1820", "dtype": "float32"}, {"name": "1821", "dtype": "float32"}, {"name": "1822", "dtype": "float32"}, {"name": "1823", "dtype": "float32"}, {"name": "1824", "dtype": "float32"}, {"name": "1825", "dtype": "float32"}, {"name": "1826", "dtype": "float32"}, {"name": "1827", "dtype": "float32"}, {"name": "1828", "dtype": "float32"}, {"name": "1829", "dtype": "float32"}, {"name": "1830", "dtype": "float32"}, {"name": "1831", "dtype": "float32"}, {"name": "1832", "dtype": "float32"}, {"name": "1833", "dtype": "float32"}, {"name": "1834", "dtype": "float32"}, {"name": "1835", "dtype": "float32"}, {"name": "1836", "dtype": "float32"}, {"name": "1837", "dtype": "float32"}, {"name": "1838", "dtype": "float32"}, {"name": "1839", "dtype": "float32"}, {"name": "1840", "dtype": "float32"}, {"name": "1841", "dtype": "float32"}, {"name": "1842", "dtype": "float32"}, {"name": "1843", "dtype": "float32"}, {"name": "1844", "dtype": "float32"}, {"name": "1845", "dtype": "float32"}, {"name": "1846", "dtype": "float32"}, {"name": "1847", "dtype": "float32"}, {"name": "1848", "dtype": "float32"}, {"name": "1849", "dtype": "float32"}, {"name": "1850", "dtype": "float32"}, {"name": "1851", "dtype": "float32"}, {"name": "1852", "dtype": "float32"}, {"name": "1853", "dtype": "float32"}, {"name": "1854", "dtype": "float32"}, {"name": "1855", "dtype": "float32"}, {"name": "1856", "dtype": "float32"}, {"name": "1857", "dtype": "float32"}, {"name": "1858", "dtype": "float32"}, {"name": "1859", "dtype": "float32"}, {"name": "1860", "dtype": "float32"}, {"name": "1861", "dtype": "float32"}, {"name": "1862", "dtype": "float32"}, {"name": "1863", "dtype": "float32"}, {"name": "1864", "dtype": "float32"}, {"name": "1865", "dtype": "float32"}, {"name": "1866", "dtype": "float32"}, {"name": "1867", "dtype": "float32"}, {"name": "1868", "dtype": "float32"}, {"name": "1869", "dtype": "float32"}, {"name": "1870", "dtype": "float32"}, {"name": "1871", "dtype": "float32"}, {"name": "1872", "dtype": "float32"}, {"name": "1873", "dtype": "float32"}, {"name": "1874", "dtype": "float32"}, {"name": "1875", "dtype": "float32"}, {"name": "1876", "dtype": "float32"}, {"name": "1877", "dtype": "float32"}, {"name": "1878", "dtype": "float32"}, {"name": "1879", "dtype": "float32"}, {"name": "1880", "dtype": "float32"}, {"name": "1881", "dtype": "float32"}, {"name": "1882", "dtype": "float32"}, {"name": "1883", "dtype": "float32"}, {"name": "1884", "dtype": "float32"}, {"name": "1885", "dtype": "float32"}, {"name": "1886", "dtype": "float32"}, {"name": "1887", "dtype": "float32"}, {"name": "1888", "dtype": "float32"}, {"name": "1889", "dtype": "float32"}, {"name": "1890", "dtype": "float32"}, {"name": "1891", "dtype": "float32"}, {"name": "1892", "dtype": "float32"}, {"name": "1893", "dtype": "float32"}, {"name": 
"1894", "dtype": "float32"}, {"name": "1895", "dtype": "float32"}, {"name": "1896", "dtype": "float32"}, {"name": "1897", "dtype": "float32"}, {"name": "1898", "dtype": "float32"}, {"name": "1899", "dtype": "float32"}, {"name": "1900", "dtype": "float32"}, {"name": "1901", "dtype": "float32"}, {"name": "1902", "dtype": "float32"}, {"name": "1903", "dtype": "float32"}, {"name": "1904", "dtype": "float32"}, {"name": "1905", "dtype": "float32"}, {"name": "1906", "dtype": "float32"}, {"name": "1907", "dtype": "float32"}, {"name": "1908", "dtype": "float32"}, {"name": "1909", "dtype": "float32"}, {"name": "1910", "dtype": "float32"}, {"name": "1911", "dtype": "float32"}, {"name": "1912", "dtype": "float32"}, {"name": "1913", "dtype": "float32"}, {"name": "1914", "dtype": "float32"}, {"name": "1915", "dtype": "float32"}, {"name": "1916", "dtype": "float32"}, {"name": "1917", "dtype": "float32"}, {"name": "1918", "dtype": "float32"}, {"name": "1919", "dtype": "float32"}, {"name": "1920", "dtype": "float32"}, {"name": "1921", "dtype": "float32"}, {"name": "1922", "dtype": "float32"}, {"name": "1923", "dtype": "float32"}, {"name": "1924", "dtype": "float32"}, {"name": "1925", "dtype": "float32"}, {"name": "1926", "dtype": "float32"}, {"name": "1927", "dtype": "float32"}, {"name": "1928", "dtype": "float32"}, {"name": "1929", "dtype": "float32"}, {"name": "1930", "dtype": "float32"}, {"name": "1931", "dtype": "float32"}, {"name": "1932", "dtype": "float32"}, {"name": "1933", "dtype": "float32"}, {"name": "1934", "dtype": "float32"}, {"name": "1935", "dtype": "float32"}, {"name": "1936", "dtype": "float32"}, {"name": "1937", "dtype": "float32"}, {"name": "1938", "dtype": "float32"}, {"name": "1939", "dtype": "float32"}, {"name": "1940", "dtype": "float32"}, {"name": "1941", "dtype": "float32"}, {"name": "1942", "dtype": "float32"}, {"name": "1943", "dtype": "float32"}, {"name": "1944", "dtype": "float32"}, {"name": "1945", "dtype": "float32"}, {"name": "1946", "dtype": "float32"}, {"name": "1947", "dtype": "float32"}, {"name": "1948", "dtype": "float32"}, {"name": "1949", "dtype": "float32"}, {"name": "1950", "dtype": "float32"}, {"name": "1951", "dtype": "float32"}, {"name": "1952", "dtype": "float32"}, {"name": "1953", "dtype": "float32"}, {"name": "1954", "dtype": "float32"}, {"name": "1955", "dtype": "float32"}, {"name": "1956", "dtype": "float32"}, {"name": "1957", "dtype": "float32"}, {"name": "1958", "dtype": "float32"}, {"name": "1959", "dtype": "float32"}, {"name": "1960", "dtype": "float32"}, {"name": "1961", "dtype": "float32"}, {"name": "1962", "dtype": "float32"}, {"name": "1963", "dtype": "float32"}, {"name": "1964", "dtype": "float32"}, {"name": "1965", "dtype": "float32"}, {"name": "1966", "dtype": "float32"}, {"name": "1967", "dtype": "float32"}, {"name": "1968", "dtype": "float32"}, {"name": "1969", "dtype": "float32"}, {"name": "1970", "dtype": "float32"}, {"name": "1971", "dtype": "float32"}, {"name": "1972", "dtype": "float32"}, {"name": "1973", "dtype": "float32"}, {"name": "1974", "dtype": "float32"}, {"name": "1975", "dtype": "float32"}, {"name": "1976", "dtype": "float32"}, {"name": "1977", "dtype": "float32"}, {"name": "1978", "dtype": "float32"}, {"name": "1979", "dtype": "float32"}, {"name": "1980", "dtype": "float32"}, {"name": "1981", "dtype": "float32"}, {"name": "1982", "dtype": "float32"}, {"name": "1983", "dtype": "float32"}, {"name": "1984", "dtype": "float32"}, {"name": "1985", "dtype": "float32"}, {"name": "1986", "dtype": "float32"}, {"name": "1987", "dtype": 
"float32"}, {"name": "1988", "dtype": "float32"}, {"name": "1989", "dtype": "float32"}, {"name": "1990", "dtype": "float32"}, {"name": "1991", "dtype": "float32"}, {"name": "1992", "dtype": "float32"}, {"name": "1993", "dtype": "float32"}, {"name": "1994", "dtype": "float32"}, {"name": "1995", "dtype": "float32"}, {"name": "1996", "dtype": "float32"}, {"name": "1997", "dtype": "float32"}, {"name": "1998", "dtype": "float32"}, {"name": "1999", "dtype": "float32"}, {"name": "2000", "dtype": "float32"}, {"name": "2001", "dtype": "float32"}, {"name": "2002", "dtype": "float32"}, {"name": "2003", "dtype": "float32"}, {"name": "2004", "dtype": "float32"}, {"name": "2005", "dtype": "float32"}, {"name": "2006", "dtype": "float32"}, {"name": "2007", "dtype": "float32"}, {"name": "2008", "dtype": "float32"}, {"name": "2009", "dtype": "float32"}, {"name": "2010", "dtype": "float32"}, {"name": "2011", "dtype": "float32"}, {"name": "2012", "dtype": "float32"}, {"name": "2013", "dtype": "float32"}, {"name": "2014", "dtype": "float32"}, {"name": "2015", "dtype": "float32"}, {"name": "2016", "dtype": "float32"}, {"name": "2017", "dtype": "float32"}, {"name": "2018", "dtype": "float32"}, {"name": "2019", "dtype": "float32"}, {"name": "2020", "dtype": "float32"}, {"name": "2021", "dtype": "float32"}, {"name": "2022", "dtype": "float32"}, {"name": "2023", "dtype": "float32"}, {"name": "2024", "dtype": "float32"}, {"name": "2025", "dtype": "float32"}, {"name": "2026", "dtype": "float32"}, {"name": "2027", "dtype": "float32"}, {"name": "2028", "dtype": "float32"}, {"name": "2029", "dtype": "float32"}, {"name": "2030", "dtype": "float32"}, {"name": "2031", "dtype": "float32"}, {"name": "2032", "dtype": "float32"}, {"name": "2033", "dtype": "float32"}, {"name": "2034", "dtype": "float32"}, {"name": "2035", "dtype": "float32"}, {"name": "2036", "dtype": "float32"}, {"name": "2037", "dtype": "float32"}, {"name": "2038", "dtype": "float32"}, {"name": "2039", "dtype": "float32"}, {"name": "2040", "dtype": "float32"}, {"name": "2041", "dtype": "float32"}, {"name": "2042", "dtype": "float32"}, {"name": "2043", "dtype": "float32"}, {"name": "2044", "dtype": "float32"}, {"name": "2045", "dtype": "float32"}, {"name": "2046", "dtype": "float32"}, {"name": "2047", "dtype": "float32"}, {"name": "label", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 307608907.5, "num_examples": 37500}, {"name": "test", "num_bytes": 102536305.0, "num_examples": 12500}], "download_size": 565384532, "dataset_size": 410145212.5}}
|
2023-08-18T00:46:31+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "PKDD_GPTNEO_Baseline"
More Information needed
|
aeabd9e70be28d384c2ad7b40fd36c6ef5eb15e9
|
# Dataset Card for CNNovel125K
*The BigKnow2022 dataset and its subsets are not yet complete; some information here may be inaccurate or inaccessible.*
## Dataset Description
- **Homepage:** (TODO)
- **Repository:** <https://github.com/RyokoAI/BigKnow2022>
- **Paper:** N/A
- **Leaderboard:** N/A
- **Point of Contact:** Ronsor/undeleted <[email protected]>
### Dataset Summary
CNNovel125K is a dataset composed of approximately 125,000 novels downloaded from the Chinese novel hosting site <http://ibiquw.com>.
### Supported Tasks and Leaderboards
This dataset is primarily intended for unsupervised training of text generation models; however, it may be useful for other purposes.
* text-classification
* text-generation
### Languages
* Simplified Chinese
## Dataset Structure
### Data Instances
```json
{
"text": "\n------------\n\n全部章节\n\n\n------------\n\n第一章 她肯定做梦呢!\n\n HT国际大酒店总统套房。\n\n 清晨的第一缕阳光照射进圣地亚哥地板上,洒落在凌乱的床单上,突然地,床上睡的正熟的人睁开眼睛,
猛然惊醒!\n\n ...",
"meta": {
"subset": "cnnovel.ibiquw",
"id": "100067",
"q": 0.9,
"lang": "zh_cn",
"title": "为爱入局:嫁给秦先生",
"author": "奥德萨"
}
}
{
"text": "\n------------\n\n全部章节\n\n\n------------\n\n第1章:出狱就大婚\n\n 凉城第一监狱,大门缓缓打开,秦峰仰起头,贪婪的呼吸了一口空气。\n\n 三年了,终于又闻到了自由的味道。\n\n 他回过头,看着目
送他出来的那群人道:...",
"meta": {
"subset": "cnnovel.ibiquw",
"id": "100059",
"q": 0.9,
"lang": "zh_cn",
"title": "绝世弃婿",
"author": "绷带怪"
}
}
```
### Data Fields
* `text`: the actual novel text, all chapters
* `meta`: entry metadata
* `subset`: dataset tag: `cnnovel.ibiquw`
* `id`: novel ID
* `q`: quality score, fixed at 0.9
* `lang`: always `zh_cn` (Simplified Chinese)
* `title`: novel title
* `author`: novel author
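To make these fields concrete, here is a minimal sketch of iterating over entries and tallying the `meta` fields (assumptions: the data is hosted on the Hugging Face Hub as `botp/RyokoAI_CNNovel125K`, the repository id attached to this card, and supports streaming; adjust the repo id to wherever you obtain the data):

```python
from collections import Counter

from datasets import load_dataset

# Stream to avoid downloading all ~125,000 novels up front.
ds = load_dataset("botp/RyokoAI_CNNovel125K", split="train", streaming=True)

authors = Counter()
for entry in ds.take(1000):  # inspect only the first 1,000 entries
    meta = entry["meta"]
    # Per the field descriptions above, these values are fixed for this subset.
    assert meta["lang"] == "zh_cn" and meta["subset"] == "cnnovel.ibiquw"
    authors[meta["author"]] += 1

print(authors.most_common(5))  # most prolific authors in the sample
```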
### Data Splits
No splitting of the data was performed.
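Since everything ships as a single split, one workable pattern (a sketch under the same hosting assumption as above, not an official recommendation) is to carve out your own held-out set:

```python
from datasets import load_dataset

ds = load_dataset("botp/RyokoAI_CNNovel125K", split="train")
# 5% held out for evaluation; the fraction and seed are illustrative.
splits = ds.train_test_split(test_size=0.05, seed=42)
train_ds, eval_ds = splits["train"], splits["test"]
```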
## Dataset Creation
### Curation Rationale
TODO
### Source Data
#### Initial Data Collection and Normalization
TODO
#### Who are the source language producers?
The authors of each novel.
### Annotations
#### Annotation process
Titles were collected alongside the novel text and IDs.
#### Who are the annotators?
There were no human annotators.
### Personal and Sensitive Information
The dataset contains only works of fiction, and we do not believe it contains any PII.
## Considerations for Using the Data
### Social Impact of Dataset
This dataset is intended to be useful for anyone who wishes to train a model to generate "more entertaining" content in Chinese.
Depending on your base language model, it may also transfer to other languages.
### Discussion of Biases
This dataset is composed of fictional works by many different authors, so its contents inevitably reflect those authors' biases. Beware of stereotypes.
### Other Known Limitations
N/A
## Additional Information
### Dataset Curators
Ronsor Labs
### Licensing Information
Apache 2.0, for all parts of which Ronsor Labs or the Ryoko AI Production Committee may be considered authors. All other material is
distributed under fair use principles.
### Citation Information
```
@misc{ryokoai2023-bigknow2022,
title = {BigKnow2022: Bringing Language Models Up to Speed},
author = {Ronsor},
year = {2023},
howpublished = {\url{https://github.com/RyokoAI/BigKnow2022}},
}
```
### Contributions
Thanks to @ronsor (GH) for gathering this dataset.
|
botp/RyokoAI_CNNovel125K
|
[
"task_categories:text-classification",
"task_categories:text-generation",
"size_categories:100K<n<1M",
"language:zh",
"license:apache-2.0",
"novel",
"training",
"region:us"
] |
2023-08-18T00:31:26+00:00
|
{"language": ["zh"], "license": "apache-2.0", "size_categories": ["100K<n<1M"], "task_categories": ["text-classification", "text-generation"], "pretty_name": "CNNovel125K", "tags": ["novel", "training"], "duplicated_from": "RyokoAI/CNNovel125K"}
|
2023-08-18T00:31:26+00:00
|
[] |
[
"zh"
] |
TAGS
#task_categories-text-classification #task_categories-text-generation #size_categories-100K<n<1M #language-Chinese #license-apache-2.0 #novel #training #region-us
|
|
[
"# Dataset Card for CNNovel125K\n\n*The BigKnow2022 dataset and its subsets are not yet complete. Not all information here may be accurate or accessible.*",
"## Dataset Description\n\n- Homepage: (TODO)\n- Repository: <URL\n- Paper: N/A \n- Leaderboard: N/A\n- Point of Contact: Ronsor/undeleted <ronsor@URL>",
"### Dataset Summary\n\nCNNovel125K is a dataset composed of approximately 125,000 novels downloaded from the Chinese novel hosting site <URL>.",
"### Supported Tasks and Leaderboards\n\nThis dataset is primarily intended for unsupervised training of text generation models; however, it may be useful for other purposes.\n\n* text-classification\n* text-generation",
"### Languages\n\n* Simplified Chinese",
"## Dataset Structure",
"### Data Instances",
"### Data Fields\n\n* 'text': the actual novel text, all chapters\n* 'meta': entry metadata\n * 'subset': dataset tag: 'URL'\n * 'id': novel ID\n * 'q': quality score, fixed at 0.9\n * 'lang': always 'zh_cn' (Simplified Chinese)\n * 'title': novel title\n * 'author': novel author",
"### Data Splits\n\nNo splitting of the data was performed.",
"## Dataset Creation",
"### Curation Rationale\n\nTODO",
"### Source Data",
"#### Initial Data Collection and Normalization\n\nTODO",
"#### Who are the source language producers?\n\nThe authors of each novel.",
"### Annotations",
"#### Annotation process\n\nTitles were collected alongside the novel text and IDs.",
"#### Who are the annotators?\n\nThere were no human annotators.",
"### Personal and Sensitive Information\n\nThe dataset contains only works of fiction, and we do not believe it contains any PII.",
"## Considerations for Using the Data",
"### Social Impact of Dataset\n\nThis dataset is intended to be useful for anyone who wishes to train a model to generate \"more entertaining\" content in Chinese.\nIt may also be useful for other languages depending on your language model.",
"### Discussion of Biases\n\nThis dataset is composed of fictional works by various authors. Because of this fact, the contents of this dataset will reflect\nthe biases of those authors. Beware of stereotypes.",
"### Other Known Limitations\n\nN/A",
"## Additional Information",
"### Dataset Curators\n\nRonsor Labs",
"### Licensing Information\n\nApache 2.0, for all parts of which Ronsor Labs or the Ryoko AI Production Committee may be considered authors. All other material is\ndistributed under fair use principles.",
"### Contributions\n\nThanks to @ronsor (GH) for gathering this dataset."
] |
[
"TAGS\n#task_categories-text-classification #task_categories-text-generation #size_categories-100K<n<1M #language-Chinese #license-apache-2.0 #novel #training #region-us \n",
"# Dataset Card for CNNovel125K\n\n*The BigKnow2022 dataset and its subsets are not yet complete. Not all information here may be accurate or accessible.*",
"## Dataset Description\n\n- Homepage: (TODO)\n- Repository: <URL\n- Paper: N/A \n- Leaderboard: N/A\n- Point of Contact: Ronsor/undeleted <ronsor@URL>",
"### Dataset Summary\n\nCNNovel125K is a dataset composed of approximately 125,000 novels downloaded from the Chinese novel hosting site <URL>.",
"### Supported Tasks and Leaderboards\n\nThis dataset is primarily intended for unsupervised training of text generation models; however, it may be useful for other purposes.\n\n* text-classification\n* text-generation",
"### Languages\n\n* Simplified Chinese",
"## Dataset Structure",
"### Data Instances",
"### Data Fields\n\n* 'text': the actual novel text, all chapters\n* 'meta': entry metadata\n * 'subset': dataset tag: 'URL'\n * 'id': novel ID\n * 'q': quality score, fixed at 0.9\n * 'lang': always 'zh_cn' (Simplified Chinese)\n * 'title': novel title\n * 'author': novel author",
"### Data Splits\n\nNo splitting of the data was performed.",
"## Dataset Creation",
"### Curation Rationale\n\nTODO",
"### Source Data",
"#### Initial Data Collection and Normalization\n\nTODO",
"#### Who are the source language producers?\n\nThe authors of each novel.",
"### Annotations",
"#### Annotation process\n\nTitles were collected alongside the novel text and IDs.",
"#### Who are the annotators?\n\nThere were no human annotators.",
"### Personal and Sensitive Information\n\nThe dataset contains only works of fiction, and we do not believe it contains any PII.",
"## Considerations for Using the Data",
"### Social Impact of Dataset\n\nThis dataset is intended to be useful for anyone who wishes to train a model to generate \"more entertaining\" content in Chinese.\nIt may also be useful for other languages depending on your language model.",
"### Discussion of Biases\n\nThis dataset is composed of fictional works by various authors. Because of this fact, the contents of this dataset will reflect\nthe biases of those authors. Beware of stereotypes.",
"### Other Known Limitations\n\nN/A",
"## Additional Information",
"### Dataset Curators\n\nRonsor Labs",
"### Licensing Information\n\nApache 2.0, for all parts of which Ronsor Labs or the Ryoko AI Production Committee may be considered authors. All other material is\ndistributed under fair use principles.",
"### Contributions\n\nThanks to @ronsor (GH) for gathering this dataset."
] |
[
58,
39,
48,
35,
49,
9,
6,
6,
91,
15,
5,
9,
4,
12,
17,
5,
19,
17,
30,
8,
51,
52,
10,
5,
10,
44,
21
] |
[
"passage: TAGS\n#task_categories-text-classification #task_categories-text-generation #size_categories-100K<n<1M #language-Chinese #license-apache-2.0 #novel #training #region-us \n# Dataset Card for CNNovel125K\n\n*The BigKnow2022 dataset and its subsets are not yet complete. Not all information here may be accurate or accessible.*## Dataset Description\n\n- Homepage: (TODO)\n- Repository: <URL\n- Paper: N/A \n- Leaderboard: N/A\n- Point of Contact: Ronsor/undeleted <ronsor@URL>### Dataset Summary\n\nCNNovel125K is a dataset composed of approximately 125,000 novels downloaded from the Chinese novel hosting site <URL>.### Supported Tasks and Leaderboards\n\nThis dataset is primarily intended for unsupervised training of text generation models; however, it may be useful for other purposes.\n\n* text-classification\n* text-generation### Languages\n\n* Simplified Chinese## Dataset Structure### Data Instances### Data Fields\n\n* 'text': the actual novel text, all chapters\n* 'meta': entry metadata\n * 'subset': dataset tag: 'URL'\n * 'id': novel ID\n * 'q': quality score, fixed at 0.9\n * 'lang': always 'zh_cn' (Simplified Chinese)\n * 'title': novel title\n * 'author': novel author### Data Splits\n\nNo splitting of the data was performed.## Dataset Creation### Curation Rationale\n\nTODO### Source Data#### Initial Data Collection and Normalization\n\nTODO#### Who are the source language producers?\n\nThe authors of each novel.### Annotations#### Annotation process\n\nTitles were collected alongside the novel text and IDs.#### Who are the annotators?\n\nThere were no human annotators.### Personal and Sensitive Information\n\nThe dataset contains only works of fiction, and we do not believe it contains any PII.## Considerations for Using the Data"
] |
47e53eca13434e4bf5b181e695337173fb8b3798
|
# Dataset Card for ScribbleHub17K
*The BigKnow2022 dataset and its subsets are not yet complete. Not all information here may be accurate or accessible.*
## Dataset Description
- **Homepage:** (TODO)
- **Repository:** <https://github.com/RyokoAI/BigKnow2022>
- **Paper:** N/A
- **Leaderboard:** N/A
- **Point of Contact:** Ronsor/undeleted <[email protected]>
### Dataset Summary
ScribbleHub17K is a dataset consisting of text from over 373,000 chapters across approximately 17,500 series posted on the
original story sharing site [Scribble Hub](https://scribblehub.com).
### Supported Tasks and Leaderboards
This dataset is primarily intended for unsupervised training of text generation models; however, it may be useful for other purposes.
* text-classification
* text-generation
### Languages
* English
## Dataset Structure
### Data Instances
```json
{
"text": " \n2082 Planet Earth the Fracture War, after a sudden fracture in our dimension unidentified beings with advance technology and u...",
"meta": {
"subset": "scribblehub",
"series": "3811",
"id": "3812",
"q": 0.91,
"title": "The First - Prologue- The Fracture War",
"author": "RobotLove",
"chapters": 1,
"rating": 5,
"rating_ct": 1,
"genre": [
"Action",
"Martial Arts",
"Romance"
],
"tags": [
"Kingdom Building",
"Loyal Subordinates",
"Male Protagonist",
"Organized Crime",
"Scheming"
]
}
}
{
"text": " For anyone that may see this, thanks for reading. I'm just here to see if a story can spill out of my mind if just start writin...",
"meta": {
"subset": "scribblehub",
"series": "586090",
"id": "586099",
"q": 0.82,
"title": "Just writing to write…i guess? - I’m here now",
"author": "BigOofStudios",
"chapters": 1,
"rating": 4.5,
"rating_ct": 2,
"genre": [
"Action",
"Comedy"
],
"tags": []
}
}
```
### Data Fields
* `text`: the actual chapter text
* `meta`: metadata for chapter and series
* `subset`: data source tag: `scribblehub`
* `series`: series ID
* `id`: chapter ID
* `lang`: always `en` (English)
* `q`: quality score (q-score) between 0.0 (terrible) and 1.0 (perfect); anything with a score `> 0.5` is generally good enough
* `title`: chapter and series title in the format `<chapter title> - <series title>`
* `chapters`: total number of chapters in the series
* `rating`: Scribble Hub rating between 0 and 5 stars
* `rating_ct`: number of ratings
* `author`: author name
* `genre`: array of Scribble Hub genres for the series
* `tags`: array of tags for the series
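Since anything above the 0.5 q-score threshold is considered usable, filtering on `meta.q` is a natural first preprocessing step. A minimal sketch, assuming the Hub repo `botp/RyokoAI_ScribbleHub17K` exposes a `train` split in a format the `datasets` library can auto-detect (both are assumptions):
```python
from datasets import load_dataset

# Stream so the full corpus never has to fit in memory.
ds = load_dataset("botp/RyokoAI_ScribbleHub17K", split="train", streaming=True)

# Keep only chapters whose q-score clears the suggested 0.5 threshold.
good = (ex for ex in ds if ex["meta"]["q"] > 0.5)
first = next(good)
print(first["meta"]["title"], first["meta"]["q"])
```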
#### Q-Score Distribution
```
0.00: 0
0.10: 0
0.20: 0
0.30: 84
0.40: 718
0.50: 3775
0.60: 22300
0.70: 72581
0.80: 137982
0.90: 135800
1.00: 59
```
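A distribution like the one above can be recomputed from the metadata; the following sketch uses the same (assumed) Hub loading setup as before:
```python
from collections import Counter

from datasets import load_dataset

ds = load_dataset("botp/RyokoAI_ScribbleHub17K", split="train", streaming=True)

# Bucket q-scores to one decimal place, mirroring the table above.
buckets = Counter()
for ex in ds:
    buckets[round(ex["meta"]["q"], 1)] += 1

for score in sorted(buckets):
    print(f"{score:.2f}: {buckets[score]}")
```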
### Data Splits
No splitting of the data was performed.
## Dataset Creation
### Curation Rationale
Scribble Hub is a home for original web stories, effectively a smaller, English version of Japan's Syosetuka ni Narou. As a
result, it is a good source for reasonably well-written creative content.
### Source Data
#### Initial Data Collection and Normalization
TODO
#### Who are the source language producers?
The authors of each novel.
### Annotations
#### Annotation process
Title, ratings, and other metadata were parsed out using scripts that will be provided in the BigKnow2022 GitHub repository.
#### Who are the annotators?
No human annotators.
### Personal and Sensitive Information
The dataset contains only works of fiction, and we do not believe it contains any PII.
## Considerations for Using the Data
### Social Impact of Dataset
This dataset is intended to be useful for anyone who wishes to train a model to generate "more entertaining" content.
It may also be useful for other languages depending on your language model.
### Discussion of Biases
This dataset is composed of fictional works by various authors. Because of this fact, the contents of this dataset will reflect
the biases of those authors. **Additionally, this dataset contains NSFW material and was not filtered. Beware of stereotypes.**
### Other Known Limitations
N/A
## Additional Information
### Dataset Curators
Ronsor Labs
### Licensing Information
Apache 2.0, for all parts of which Ronsor Labs or the Ryoko AI Production Committee may be considered authors. All other material is
distributed under fair use principles.
### Citation Information
```
@misc{ryokoai2023-bigknow2022,
title = {BigKnow2022: Bringing Language Models Up to Speed},
author = {Ronsor},
year = {2023},
howpublished = {\url{https://github.com/RyokoAI/BigKnow2022}},
}
```
### Contributions
Thanks to @ronsor (GH) for gathering this dataset.
|
botp/RyokoAI_ScribbleHub17K
|
[
"task_categories:text-classification",
"task_categories:text-generation",
"size_categories:100K<n<1M",
"language:en",
"license:apache-2.0",
"novel",
"training",
"story",
"region:us"
] |
2023-08-18T00:33:10+00:00
|
{"language": ["en"], "license": "apache-2.0", "size_categories": ["100K<n<1M"], "task_categories": ["text-classification", "text-generation"], "pretty_name": "ScribbleHub17K", "tags": ["novel", "training", "story"], "duplicated_from": "RyokoAI/ScribbleHub17K"}
|
2023-08-18T00:33:10+00:00
|
[] |
[
"en"
] |
TAGS
#task_categories-text-classification #task_categories-text-generation #size_categories-100K<n<1M #language-English #license-apache-2.0 #novel #training #story #region-us
|
# Dataset Card for ScribbleHub17K
*The BigKnow2022 dataset and its subsets are not yet complete. Not all information here may be accurate or accessible.*
## Dataset Description
- Homepage: (TODO)
- Repository: <URL
- Paper: N/A
- Leaderboard: N/A
- Point of Contact: Ronsor/undeleted <ronsor@URL>
### Dataset Summary
ScribbleHub17K is a dataset consisting of text from over 373,000 chapters across approximately 17,500 series posted on the
original story sharing site Scribble Hub.
### Supported Tasks and Leaderboards
This dataset is primarily intended for unsupervised training of text generation models; however, it may be useful for other purposes.
* text-classification
* text-generation
### Languages
* English
## Dataset Structure
### Data Instances
### Data Fields
* 'text': the actual chapter text
* 'meta': metadata for chapter and series
* 'subset': data source tag: 'scribblehub'
* 'series': series ID
* 'id': chapter ID
* 'lang': always 'en' (English)
* 'q': quality score (q-score) between 0.0 (terrible) and 1.0 (perfect); anything with a score '> 0.5' is generally good enough
* 'title': chapter and series title in the format '<chapter title> - <series title>'
* 'chapters': total number of chapters in the series
* 'rating': Scribble Hub rating between 0 and 5 stars
* 'rating_ct': number of ratings
* 'author': author name
* 'genre': array of Scribble Hub genres for the series
* 'tags': array of tags for the series
#### Q-Score Distribution
### Data Splits
No splitting of the data was performed.
## Dataset Creation
### Curation Rationale
Scribble Hub is a home for original web stories, effectively a smaller, English version of Japan's Syosetuka ni Narou. As a
result, it is a good source for reasonably well-written creative content.
### Source Data
#### Initial Data Collection and Normalization
TODO
#### Who are the source language producers?
The authors of each novel.
### Annotations
#### Annotation process
Title, ratings, and other metadata were parsed out using scripts that will be provided in the BigKnow2022 GitHub repository.
#### Who are the annotators?
No human annotators.
### Personal and Sensitive Information
The dataset contains only works of fiction, and we do not believe it contains any PII.
## Considerations for Using the Data
### Social Impact of Dataset
This dataset is intended to be useful for anyone who wishes to train a model to generate "more entertaining" content.
It may also be useful for other languages depending on your language model.
### Discussion of Biases
This dataset is composed of fictional works by various authors. Because of this fact, the contents of this dataset will reflect
the biases of those authors. Additionally, this dataset contains NSFW material and was not filtered. Beware of stereotypes.
### Other Known Limitations
N/A
## Additional Information
### Dataset Curators
Ronsor Labs
### Licensing Information
Apache 2.0, for all parts of which Ronsor Labs or the Ryoko AI Production Committee may be considered authors. All other material is
distributed under fair use principles.
### Contributions
Thanks to @ronsor (GH) for gathering this dataset.
|
[
"# Dataset Card for ScribbleHub17K\n\n*The BigKnow2022 dataset and its subsets are not yet complete. Not all information here may be accurate or accessible.*",
"## Dataset Description\n\n- Homepage: (TODO)\n- Repository: <URL\n- Paper: N/A \n- Leaderboard: N/A\n- Point of Contact: Ronsor/undeleted <ronsor@URL>",
"### Dataset Summary\n\nScribbleHub17K is a dataset consisting of text from over 373,000 chapters across approximately 17,500 series posted on the\noriginal story sharing site Scribble Hub.",
"### Supported Tasks and Leaderboards\n\nThis dataset is primarily intended for unsupervised training of text generation models; however, it may be useful for other purposes.\n\n* text-classification\n* text-generation",
"### Languages\n\n* English",
"## Dataset Structure",
"### Data Instances",
"### Data Fields\n\n* 'text': the actual chapter text\n* 'meta': metadata for chapter and series\n * 'subset': data source tag: 'scribblehub'\n * 'series': series ID\n * 'id': chapter ID\n * 'lang': always 'en' (English)\n * 'q': quality score (q-score) between (0.0) terrible and 1.0 (perfect); anything with a score '> 0.5' is generally good enough\n * 'title': chapter and series title in the format '<chapter title> - <series title>'\n * 'chapters': total number of chapters in the series\n * 'rating': Scribble Hub rating between 0 and 5 stars\n * 'rating_ct': number of ratings\n * 'author': author name\n * 'genre': array of Scribble Hub genres for the series\n * 'tags': array of tags for the series",
"#### Q-Score Distribution",
"### Data Splits\n\nNo splitting of the data was performed.",
"## Dataset Creation",
"### Curation Rationale\n\nScribble Hub is a home for original web stories, effectively a smaller, English version of Japan's Syosetuka ni Narou. As a\nresult, it is a good source for reasonably well written creative content.",
"### Source Data",
"#### Initial Data Collection and Normalization\n\nTODO",
"#### Who are the source language producers?\n\nThe authors of each novel.",
"### Annotations",
"#### Annotation process\n\nTitle, ratings, and other metadata were parsed out using scripts that will be provided in the BigKnow2022 GitHub repository.",
"#### Who are the annotators?\n\nNo human annotators.",
"### Personal and Sensitive Information\n\nThe dataset contains only works of fiction, and we do not believe it contains any PII.",
"## Considerations for Using the Data",
"### Social Impact of Dataset\n\nThis dataset is intended to be useful for anyone who wishes to train a model to generate \"more entertaining\" content.\nIt may also be useful for other languages depending on your language model.",
"### Discussion of Biases\n\nThis dataset is composed of fictional works by various authors. Because of this fact, the contents of this dataset will reflect\nthe biases of those authors. Additionally, this dataset contains NSFW material and was not filtered. Beware of stereotypes.",
"### Other Known Limitations\n\nN/A",
"## Additional Information",
"### Dataset Curators\n\nRonsor Labs",
"### Licensing Information\n\nApache 2.0, for all parts of which Ronsor Labs or the Ryoko AI Production Committee may be considered authors. All other material is\ndistributed under fair use principles.",
"### Contributions\n\nThanks to @ronsor (GH) for gathering this dataset."
] |
[
"TAGS\n#task_categories-text-classification #task_categories-text-generation #size_categories-100K<n<1M #language-English #license-apache-2.0 #novel #training #story #region-us \n",
"# Dataset Card for ScribbleHub17K\n\n*The BigKnow2022 dataset and its subsets are not yet complete. Not all information here may be accurate or accessible.*",
"## Dataset Description\n\n- Homepage: (TODO)\n- Repository: <URL\n- Paper: N/A \n- Leaderboard: N/A\n- Point of Contact: Ronsor/undeleted <ronsor@URL>",
"### Dataset Summary\n\nScribbleHub17K is a dataset consisting of text from over 373,000 chapters across approximately 17,500 series posted on the\noriginal story sharing site Scribble Hub.",
"### Supported Tasks and Leaderboards\n\nThis dataset is primarily intended for unsupervised training of text generation models; however, it may be useful for other purposes.\n\n* text-classification\n* text-generation",
"### Languages\n\n* English",
"## Dataset Structure",
"### Data Instances",
"### Data Fields\n\n* 'text': the actual chapter text\n* 'meta': metadata for chapter and series\n * 'subset': data source tag: 'scribblehub'\n * 'series': series ID\n * 'id': chapter ID\n * 'lang': always 'en' (English)\n * 'q': quality score (q-score) between (0.0) terrible and 1.0 (perfect); anything with a score '> 0.5' is generally good enough\n * 'title': chapter and series title in the format '<chapter title> - <series title>'\n * 'chapters': total number of chapters in the series\n * 'rating': Scribble Hub rating between 0 and 5 stars\n * 'rating_ct': number of ratings\n * 'author': author name\n * 'genre': array of Scribble Hub genres for the series\n * 'tags': array of tags for the series",
"#### Q-Score Distribution",
"### Data Splits\n\nNo splitting of the data was performed.",
"## Dataset Creation",
"### Curation Rationale\n\nScribble Hub is a home for original web stories, effectively a smaller, English version of Japan's Syosetuka ni Narou. As a\nresult, it is a good source for reasonably well written creative content.",
"### Source Data",
"#### Initial Data Collection and Normalization\n\nTODO",
"#### Who are the source language producers?\n\nThe authors of each novel.",
"### Annotations",
"#### Annotation process\n\nTitle, ratings, and other metadata were parsed out using scripts that will be provided in the BigKnow2022 GitHub repository.",
"#### Who are the annotators?\n\nNo human annotators.",
"### Personal and Sensitive Information\n\nThe dataset contains only works of fiction, and we do not believe it contains any PII.",
"## Considerations for Using the Data",
"### Social Impact of Dataset\n\nThis dataset is intended to be useful for anyone who wishes to train a model to generate \"more entertaining\" content.\nIt may also be useful for other languages depending on your language model.",
"### Discussion of Biases\n\nThis dataset is composed of fictional works by various authors. Because of this fact, the contents of this dataset will reflect\nthe biases of those authors. Additionally, this dataset contains NSFW material and was not filtered. Beware of stereotypes.",
"### Other Known Limitations\n\nN/A",
"## Additional Information",
"### Dataset Curators\n\nRonsor Labs",
"### Licensing Information\n\nApache 2.0, for all parts of which Ronsor Labs or the Ryoko AI Production Committee may be considered authors. All other material is\ndistributed under fair use principles.",
"### Contributions\n\nThanks to @ronsor (GH) for gathering this dataset."
] |
[
59,
40,
48,
43,
49,
6,
6,
6,
207,
8,
15,
5,
54,
4,
12,
17,
5,
38,
15,
30,
8,
49,
71,
10,
5,
10,
44,
21
] |
[
"passage: TAGS\n#task_categories-text-classification #task_categories-text-generation #size_categories-100K<n<1M #language-English #license-apache-2.0 #novel #training #story #region-us \n# Dataset Card for ScribbleHub17K\n\n*The BigKnow2022 dataset and its subsets are not yet complete. Not all information here may be accurate or accessible.*## Dataset Description\n\n- Homepage: (TODO)\n- Repository: <URL\n- Paper: N/A \n- Leaderboard: N/A\n- Point of Contact: Ronsor/undeleted <ronsor@URL>### Dataset Summary\n\nScribbleHub17K is a dataset consisting of text from over 373,000 chapters across approximately 17,500 series posted on the\noriginal story sharing site Scribble Hub.### Supported Tasks and Leaderboards\n\nThis dataset is primarily intended for unsupervised training of text generation models; however, it may be useful for other purposes.\n\n* text-classification\n* text-generation### Languages\n\n* English## Dataset Structure### Data Instances### Data Fields\n\n* 'text': the actual chapter text\n* 'meta': metadata for chapter and series\n * 'subset': data source tag: 'scribblehub'\n * 'series': series ID\n * 'id': chapter ID\n * 'lang': always 'en' (English)\n * 'q': quality score (q-score) between (0.0) terrible and 1.0 (perfect); anything with a score '> 0.5' is generally good enough\n * 'title': chapter and series title in the format '<chapter title> - <series title>'\n * 'chapters': total number of chapters in the series\n * 'rating': Scribble Hub rating between 0 and 5 stars\n * 'rating_ct': number of ratings\n * 'author': author name\n * 'genre': array of Scribble Hub genres for the series\n * 'tags': array of tags for the series#### Q-Score Distribution### Data Splits\n\nNo splitting of the data was performed.## Dataset Creation"
] |
b6eaff1dee1f8332ba14d0a6fac959d4f3c1229e
|
# ShareGPT-Chinese-English-90k Bilingual Chinese-English Human-Machine Q&A Dataset
A high-quality parallel Chinese-English human-machine Q&A dataset covering user questions from real, complex scenarios, intended for training high-quality dialogue models (more robust in instruction distribution than data generated by repeatedly polling an API to simulate machine Q&A).
Features:
- 1. Provides parallel Chinese and English corpora that express exactly the same meaning, enabling bilingual dialogue model training.
- 2. None of the questions are fake data fabricated by human guesswork plus API polling (as in Moss); they better match the instruction distribution and question phrasing of real user scenarios.
- 3. The ShareGPT data was collected from conversations voluntarily shared by users, which provides a very natural layer of filtering (human judgment) that screens out most low-quality conversations.
Note: this data was collected at a point in time before ChatGPT showed any obvious degradation in capability. (Speculation: on one hand, the vendor may have replaced the 150B GPT-3.5 with a distilled version of roughly 10B parameters to cut costs; on the other hand, introducing more refusals may have degraded the model's ability to link knowledge and logic.)
Training an excellent conversational LLM depends on high-quality multi-turn dialogue datasets. If you would also like to become a volunteer,
you are welcome to join the dataset QQ group: 130920969 to exchange, collect, and build high-quality datasets together.
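The card does not document a file schema; the snippet below is a minimal loading sketch, assuming the repository's data files are in a format the `datasets` library can auto-detect (the `train` split name is also an assumption):
```python
from datasets import load_dataset

# Assumption: the repo id comes from this card; the split name and column
# layout are guesses, since the card does not document a schema.
ds = load_dataset("botp/shareAI_ShareGPT-Chinese-English-90k", split="train")
print(ds[0])  # inspect one bilingual Q&A record to discover the actual fields
```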
|
botp/shareAI_ShareGPT-Chinese-English-90k
|
[
"license:apache-2.0",
"region:us"
] |
2023-08-18T00:49:07+00:00
|
{"license": "apache-2.0", "duplicated_from": "shareAI/ShareGPT-Chinese-English-90k"}
|
2023-08-18T00:49:07+00:00
|
[] |
[] |
TAGS
#license-apache-2.0 #region-us
|
# ShareGPT-Chinese-English-90k Bilingual Chinese-English Human-Machine Q&A Dataset
A high-quality parallel Chinese-English human-machine Q&A dataset covering user questions from real, complex scenarios, intended for training high-quality dialogue models (more robust in instruction distribution than data generated by repeatedly polling an API to simulate machine Q&A).
Features:
- 1. Provides parallel Chinese and English corpora that express exactly the same meaning, enabling bilingual dialogue model training.
- 2. None of the questions are fake data fabricated by human guesswork plus API polling (as in Moss); they better match the instruction distribution and question phrasing of real user scenarios.
- 3. The ShareGPT data was collected from conversations voluntarily shared by users, which provides a very natural layer of filtering (human judgment) that screens out most low-quality conversations.
Note: this data was collected at a point in time before ChatGPT showed any obvious degradation in capability. (Speculation: on one hand, the vendor may have replaced the 150B GPT-3.5 with a distilled version of roughly 10B parameters to cut costs; on the other hand, introducing more refusals may have degraded the model's ability to link knowledge and logic.)
Training an excellent conversational LLM depends on high-quality multi-turn dialogue datasets. If you would also like to become a volunteer,
you are welcome to join the dataset QQ group: 130920969 to exchange, collect, and build high-quality datasets together.
|
[
"# ShareGPT-Chinese-English-90k 中英文双语人机问答数据集 \n中英文平行双语优质人机问答数据集,覆盖真实复杂场景下的用户提问。用于训练高质量的对话模型 (比那些通过反复调用api接口生成机器模拟问答的数据在指令分布上更鲁棒)\n特点:\n- 1.同时提供意义表达完全相同的中英文平行对照语料,可进行双语对话模型训练。\n- 2.所有问题均非人为臆想加上api轮询拟造的假数据(如Moss),更加符合真实用户场景的指令分布和提问表达。\n- 3.sharegpt数据集是由网友自发分享而收集到的,相当于有一层非常天然的过滤(通过人类感觉),筛除了大部分体验不好的对话。 \n\n补充:该数据收集于chatGPT还未表现出明显智力退化的时间点。(猜测一方面可能是官方为了减小开支把150B的gpt3.5替换成10b左右的蒸馏版本了,另一方面可能是由于引入了更多的拒绝答复导致模型连接知识逻辑的程度退化)\n\n\n优秀对话llm的训练离不开高质量的多轮对话数据集,如果你也想成为志愿者 \n欢迎加入数据集QQ群:130920969,共同进行优质数据集的交流、收集和建设工作"
] |
[
"TAGS\n#license-apache-2.0 #region-us \n",
"# ShareGPT-Chinese-English-90k 中英文双语人机问答数据集 \n中英文平行双语优质人机问答数据集,覆盖真实复杂场景下的用户提问。用于训练高质量的对话模型 (比那些通过反复调用api接口生成机器模拟问答的数据在指令分布上更鲁棒)\n特点:\n- 1.同时提供意义表达完全相同的中英文平行对照语料,可进行双语对话模型训练。\n- 2.所有问题均非人为臆想加上api轮询拟造的假数据(如Moss),更加符合真实用户场景的指令分布和提问表达。\n- 3.sharegpt数据集是由网友自发分享而收集到的,相当于有一层非常天然的过滤(通过人类感觉),筛除了大部分体验不好的对话。 \n\n补充:该数据收集于chatGPT还未表现出明显智力退化的时间点。(猜测一方面可能是官方为了减小开支把150B的gpt3.5替换成10b左右的蒸馏版本了,另一方面可能是由于引入了更多的拒绝答复导致模型连接知识逻辑的程度退化)\n\n\n优秀对话llm的训练离不开高质量的多轮对话数据集,如果你也想成为志愿者 \n欢迎加入数据集QQ群:130920969,共同进行优质数据集的交流、收集和建设工作"
] |
[
14,
291
] |
[
"passage: TAGS\n#license-apache-2.0 #region-us \n# ShareGPT-Chinese-English-90k 中英文双语人机问答数据集 \n中英文平行双语优质人机问答数据集,覆盖真实复杂场景下的用户提问。用于训练高质量的对话模型 (比那些通过反复调用api接口生成机器模拟问答的数据在指令分布上更鲁棒)\n特点:\n- 1.同时提供意义表达完全相同的中英文平行对照语料,可进行双语对话模型训练。\n- 2.所有问题均非人为臆想加上api轮询拟造的假数据(如Moss),更加符合真实用户场景的指令分布和提问表达。\n- 3.sharegpt数据集是由网友自发分享而收集到的,相当于有一层非常天然的过滤(通过人类感觉),筛除了大部分体验不好的对话。 \n\n补充:该数据收集于chatGPT还未表现出明显智力退化的时间点。(猜测一方面可能是官方为了减小开支把150B的gpt3.5替换成10b左右的蒸馏版本了,另一方面可能是由于引入了更多的拒绝答复导致模型连接知识逻辑的程度退化)\n\n\n优秀对话llm的训练离不开高质量的多轮对话数据集,如果你也想成为志愿者 \n欢迎加入数据集QQ群:130920969,共同进行优质数据集的交流、收集和建设工作"
] |
71eaa65ba33771100e51d23f4c6ce81a7c0533a0
|
# Dataset Card for "PKDD_BERT_Finetuned"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
EgilKarlsen/PKDD_BERT_Finetuned
|
[
"region:us"
] |
2023-08-18T00:52:57+00:00
|
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "0", "dtype": "float32"}, {"name": "1", "dtype": "float32"}, {"name": "2", "dtype": "float32"}, {"name": "3", "dtype": "float32"}, {"name": "4", "dtype": "float32"}, {"name": "5", "dtype": "float32"}, {"name": "6", "dtype": "float32"}, {"name": "7", "dtype": "float32"}, {"name": "8", "dtype": "float32"}, {"name": "9", "dtype": "float32"}, {"name": "10", "dtype": "float32"}, {"name": "11", "dtype": "float32"}, {"name": "12", "dtype": "float32"}, {"name": "13", "dtype": "float32"}, {"name": "14", "dtype": "float32"}, {"name": "15", "dtype": "float32"}, {"name": "16", "dtype": "float32"}, {"name": "17", "dtype": "float32"}, {"name": "18", "dtype": "float32"}, {"name": "19", "dtype": "float32"}, {"name": "20", "dtype": "float32"}, {"name": "21", "dtype": "float32"}, {"name": "22", "dtype": "float32"}, {"name": "23", "dtype": "float32"}, {"name": "24", "dtype": "float32"}, {"name": "25", "dtype": "float32"}, {"name": "26", "dtype": "float32"}, {"name": "27", "dtype": "float32"}, {"name": "28", "dtype": "float32"}, {"name": "29", "dtype": "float32"}, {"name": "30", "dtype": "float32"}, {"name": "31", "dtype": "float32"}, {"name": "32", "dtype": "float32"}, {"name": "33", "dtype": "float32"}, {"name": "34", "dtype": "float32"}, {"name": "35", "dtype": "float32"}, {"name": "36", "dtype": "float32"}, {"name": "37", "dtype": "float32"}, {"name": "38", "dtype": "float32"}, {"name": "39", "dtype": "float32"}, {"name": "40", "dtype": "float32"}, {"name": "41", "dtype": "float32"}, {"name": "42", "dtype": "float32"}, {"name": "43", "dtype": "float32"}, {"name": "44", "dtype": "float32"}, {"name": "45", "dtype": "float32"}, {"name": "46", "dtype": "float32"}, {"name": "47", "dtype": "float32"}, {"name": "48", "dtype": "float32"}, {"name": "49", "dtype": "float32"}, {"name": "50", "dtype": "float32"}, {"name": "51", "dtype": "float32"}, {"name": "52", "dtype": "float32"}, {"name": "53", "dtype": "float32"}, {"name": "54", "dtype": "float32"}, {"name": "55", "dtype": "float32"}, {"name": "56", "dtype": "float32"}, {"name": "57", "dtype": "float32"}, {"name": "58", "dtype": "float32"}, {"name": "59", "dtype": "float32"}, {"name": "60", "dtype": "float32"}, {"name": "61", "dtype": "float32"}, {"name": "62", "dtype": "float32"}, {"name": "63", "dtype": "float32"}, {"name": "64", "dtype": "float32"}, {"name": "65", "dtype": "float32"}, {"name": "66", "dtype": "float32"}, {"name": "67", "dtype": "float32"}, {"name": "68", "dtype": "float32"}, {"name": "69", "dtype": "float32"}, {"name": "70", "dtype": "float32"}, {"name": "71", "dtype": "float32"}, {"name": "72", "dtype": "float32"}, {"name": "73", "dtype": "float32"}, {"name": "74", "dtype": "float32"}, {"name": "75", "dtype": "float32"}, {"name": "76", "dtype": "float32"}, {"name": "77", "dtype": "float32"}, {"name": "78", "dtype": "float32"}, {"name": "79", "dtype": "float32"}, {"name": "80", "dtype": "float32"}, {"name": "81", "dtype": "float32"}, {"name": "82", "dtype": "float32"}, {"name": "83", "dtype": "float32"}, {"name": "84", "dtype": "float32"}, {"name": "85", "dtype": "float32"}, {"name": "86", "dtype": "float32"}, {"name": "87", "dtype": "float32"}, {"name": "88", "dtype": "float32"}, {"name": "89", "dtype": "float32"}, {"name": "90", "dtype": "float32"}, {"name": "91", "dtype": "float32"}, {"name": "92", "dtype": "float32"}, {"name": "93", "dtype": "float32"}, 
{"name": "94", "dtype": "float32"}, {"name": "95", "dtype": "float32"}, {"name": "96", "dtype": "float32"}, {"name": "97", "dtype": "float32"}, {"name": "98", "dtype": "float32"}, {"name": "99", "dtype": "float32"}, {"name": "100", "dtype": "float32"}, {"name": "101", "dtype": "float32"}, {"name": "102", "dtype": "float32"}, {"name": "103", "dtype": "float32"}, {"name": "104", "dtype": "float32"}, {"name": "105", "dtype": "float32"}, {"name": "106", "dtype": "float32"}, {"name": "107", "dtype": "float32"}, {"name": "108", "dtype": "float32"}, {"name": "109", "dtype": "float32"}, {"name": "110", "dtype": "float32"}, {"name": "111", "dtype": "float32"}, {"name": "112", "dtype": "float32"}, {"name": "113", "dtype": "float32"}, {"name": "114", "dtype": "float32"}, {"name": "115", "dtype": "float32"}, {"name": "116", "dtype": "float32"}, {"name": "117", "dtype": "float32"}, {"name": "118", "dtype": "float32"}, {"name": "119", "dtype": "float32"}, {"name": "120", "dtype": "float32"}, {"name": "121", "dtype": "float32"}, {"name": "122", "dtype": "float32"}, {"name": "123", "dtype": "float32"}, {"name": "124", "dtype": "float32"}, {"name": "125", "dtype": "float32"}, {"name": "126", "dtype": "float32"}, {"name": "127", "dtype": "float32"}, {"name": "128", "dtype": "float32"}, {"name": "129", "dtype": "float32"}, {"name": "130", "dtype": "float32"}, {"name": "131", "dtype": "float32"}, {"name": "132", "dtype": "float32"}, {"name": "133", "dtype": "float32"}, {"name": "134", "dtype": "float32"}, {"name": "135", "dtype": "float32"}, {"name": "136", "dtype": "float32"}, {"name": "137", "dtype": "float32"}, {"name": "138", "dtype": "float32"}, {"name": "139", "dtype": "float32"}, {"name": "140", "dtype": "float32"}, {"name": "141", "dtype": "float32"}, {"name": "142", "dtype": "float32"}, {"name": "143", "dtype": "float32"}, {"name": "144", "dtype": "float32"}, {"name": "145", "dtype": "float32"}, {"name": "146", "dtype": "float32"}, {"name": "147", "dtype": "float32"}, {"name": "148", "dtype": "float32"}, {"name": "149", "dtype": "float32"}, {"name": "150", "dtype": "float32"}, {"name": "151", "dtype": "float32"}, {"name": "152", "dtype": "float32"}, {"name": "153", "dtype": "float32"}, {"name": "154", "dtype": "float32"}, {"name": "155", "dtype": "float32"}, {"name": "156", "dtype": "float32"}, {"name": "157", "dtype": "float32"}, {"name": "158", "dtype": "float32"}, {"name": "159", "dtype": "float32"}, {"name": "160", "dtype": "float32"}, {"name": "161", "dtype": "float32"}, {"name": "162", "dtype": "float32"}, {"name": "163", "dtype": "float32"}, {"name": "164", "dtype": "float32"}, {"name": "165", "dtype": "float32"}, {"name": "166", "dtype": "float32"}, {"name": "167", "dtype": "float32"}, {"name": "168", "dtype": "float32"}, {"name": "169", "dtype": "float32"}, {"name": "170", "dtype": "float32"}, {"name": "171", "dtype": "float32"}, {"name": "172", "dtype": "float32"}, {"name": "173", "dtype": "float32"}, {"name": "174", "dtype": "float32"}, {"name": "175", "dtype": "float32"}, {"name": "176", "dtype": "float32"}, {"name": "177", "dtype": "float32"}, {"name": "178", "dtype": "float32"}, {"name": "179", "dtype": "float32"}, {"name": "180", "dtype": "float32"}, {"name": "181", "dtype": "float32"}, {"name": "182", "dtype": "float32"}, {"name": "183", "dtype": "float32"}, {"name": "184", "dtype": "float32"}, {"name": "185", "dtype": "float32"}, {"name": "186", "dtype": "float32"}, {"name": "187", "dtype": "float32"}, {"name": "188", "dtype": "float32"}, {"name": "189", "dtype": "float32"}, {"name": 
"190", "dtype": "float32"}, {"name": "191", "dtype": "float32"}, {"name": "192", "dtype": "float32"}, {"name": "193", "dtype": "float32"}, {"name": "194", "dtype": "float32"}, {"name": "195", "dtype": "float32"}, {"name": "196", "dtype": "float32"}, {"name": "197", "dtype": "float32"}, {"name": "198", "dtype": "float32"}, {"name": "199", "dtype": "float32"}, {"name": "200", "dtype": "float32"}, {"name": "201", "dtype": "float32"}, {"name": "202", "dtype": "float32"}, {"name": "203", "dtype": "float32"}, {"name": "204", "dtype": "float32"}, {"name": "205", "dtype": "float32"}, {"name": "206", "dtype": "float32"}, {"name": "207", "dtype": "float32"}, {"name": "208", "dtype": "float32"}, {"name": "209", "dtype": "float32"}, {"name": "210", "dtype": "float32"}, {"name": "211", "dtype": "float32"}, {"name": "212", "dtype": "float32"}, {"name": "213", "dtype": "float32"}, {"name": "214", "dtype": "float32"}, {"name": "215", "dtype": "float32"}, {"name": "216", "dtype": "float32"}, {"name": "217", "dtype": "float32"}, {"name": "218", "dtype": "float32"}, {"name": "219", "dtype": "float32"}, {"name": "220", "dtype": "float32"}, {"name": "221", "dtype": "float32"}, {"name": "222", "dtype": "float32"}, {"name": "223", "dtype": "float32"}, {"name": "224", "dtype": "float32"}, {"name": "225", "dtype": "float32"}, {"name": "226", "dtype": "float32"}, {"name": "227", "dtype": "float32"}, {"name": "228", "dtype": "float32"}, {"name": "229", "dtype": "float32"}, {"name": "230", "dtype": "float32"}, {"name": "231", "dtype": "float32"}, {"name": "232", "dtype": "float32"}, {"name": "233", "dtype": "float32"}, {"name": "234", "dtype": "float32"}, {"name": "235", "dtype": "float32"}, {"name": "236", "dtype": "float32"}, {"name": "237", "dtype": "float32"}, {"name": "238", "dtype": "float32"}, {"name": "239", "dtype": "float32"}, {"name": "240", "dtype": "float32"}, {"name": "241", "dtype": "float32"}, {"name": "242", "dtype": "float32"}, {"name": "243", "dtype": "float32"}, {"name": "244", "dtype": "float32"}, {"name": "245", "dtype": "float32"}, {"name": "246", "dtype": "float32"}, {"name": "247", "dtype": "float32"}, {"name": "248", "dtype": "float32"}, {"name": "249", "dtype": "float32"}, {"name": "250", "dtype": "float32"}, {"name": "251", "dtype": "float32"}, {"name": "252", "dtype": "float32"}, {"name": "253", "dtype": "float32"}, {"name": "254", "dtype": "float32"}, {"name": "255", "dtype": "float32"}, {"name": "256", "dtype": "float32"}, {"name": "257", "dtype": "float32"}, {"name": "258", "dtype": "float32"}, {"name": "259", "dtype": "float32"}, {"name": "260", "dtype": "float32"}, {"name": "261", "dtype": "float32"}, {"name": "262", "dtype": "float32"}, {"name": "263", "dtype": "float32"}, {"name": "264", "dtype": "float32"}, {"name": "265", "dtype": "float32"}, {"name": "266", "dtype": "float32"}, {"name": "267", "dtype": "float32"}, {"name": "268", "dtype": "float32"}, {"name": "269", "dtype": "float32"}, {"name": "270", "dtype": "float32"}, {"name": "271", "dtype": "float32"}, {"name": "272", "dtype": "float32"}, {"name": "273", "dtype": "float32"}, {"name": "274", "dtype": "float32"}, {"name": "275", "dtype": "float32"}, {"name": "276", "dtype": "float32"}, {"name": "277", "dtype": "float32"}, {"name": "278", "dtype": "float32"}, {"name": "279", "dtype": "float32"}, {"name": "280", "dtype": "float32"}, {"name": "281", "dtype": "float32"}, {"name": "282", "dtype": "float32"}, {"name": "283", "dtype": "float32"}, {"name": "284", "dtype": "float32"}, {"name": "285", "dtype": "float32"}, {"name": 
"286", "dtype": "float32"}, {"name": "287", "dtype": "float32"}, {"name": "288", "dtype": "float32"}, {"name": "289", "dtype": "float32"}, {"name": "290", "dtype": "float32"}, {"name": "291", "dtype": "float32"}, {"name": "292", "dtype": "float32"}, {"name": "293", "dtype": "float32"}, {"name": "294", "dtype": "float32"}, {"name": "295", "dtype": "float32"}, {"name": "296", "dtype": "float32"}, {"name": "297", "dtype": "float32"}, {"name": "298", "dtype": "float32"}, {"name": "299", "dtype": "float32"}, {"name": "300", "dtype": "float32"}, {"name": "301", "dtype": "float32"}, {"name": "302", "dtype": "float32"}, {"name": "303", "dtype": "float32"}, {"name": "304", "dtype": "float32"}, {"name": "305", "dtype": "float32"}, {"name": "306", "dtype": "float32"}, {"name": "307", "dtype": "float32"}, {"name": "308", "dtype": "float32"}, {"name": "309", "dtype": "float32"}, {"name": "310", "dtype": "float32"}, {"name": "311", "dtype": "float32"}, {"name": "312", "dtype": "float32"}, {"name": "313", "dtype": "float32"}, {"name": "314", "dtype": "float32"}, {"name": "315", "dtype": "float32"}, {"name": "316", "dtype": "float32"}, {"name": "317", "dtype": "float32"}, {"name": "318", "dtype": "float32"}, {"name": "319", "dtype": "float32"}, {"name": "320", "dtype": "float32"}, {"name": "321", "dtype": "float32"}, {"name": "322", "dtype": "float32"}, {"name": "323", "dtype": "float32"}, {"name": "324", "dtype": "float32"}, {"name": "325", "dtype": "float32"}, {"name": "326", "dtype": "float32"}, {"name": "327", "dtype": "float32"}, {"name": "328", "dtype": "float32"}, {"name": "329", "dtype": "float32"}, {"name": "330", "dtype": "float32"}, {"name": "331", "dtype": "float32"}, {"name": "332", "dtype": "float32"}, {"name": "333", "dtype": "float32"}, {"name": "334", "dtype": "float32"}, {"name": "335", "dtype": "float32"}, {"name": "336", "dtype": "float32"}, {"name": "337", "dtype": "float32"}, {"name": "338", "dtype": "float32"}, {"name": "339", "dtype": "float32"}, {"name": "340", "dtype": "float32"}, {"name": "341", "dtype": "float32"}, {"name": "342", "dtype": "float32"}, {"name": "343", "dtype": "float32"}, {"name": "344", "dtype": "float32"}, {"name": "345", "dtype": "float32"}, {"name": "346", "dtype": "float32"}, {"name": "347", "dtype": "float32"}, {"name": "348", "dtype": "float32"}, {"name": "349", "dtype": "float32"}, {"name": "350", "dtype": "float32"}, {"name": "351", "dtype": "float32"}, {"name": "352", "dtype": "float32"}, {"name": "353", "dtype": "float32"}, {"name": "354", "dtype": "float32"}, {"name": "355", "dtype": "float32"}, {"name": "356", "dtype": "float32"}, {"name": "357", "dtype": "float32"}, {"name": "358", "dtype": "float32"}, {"name": "359", "dtype": "float32"}, {"name": "360", "dtype": "float32"}, {"name": "361", "dtype": "float32"}, {"name": "362", "dtype": "float32"}, {"name": "363", "dtype": "float32"}, {"name": "364", "dtype": "float32"}, {"name": "365", "dtype": "float32"}, {"name": "366", "dtype": "float32"}, {"name": "367", "dtype": "float32"}, {"name": "368", "dtype": "float32"}, {"name": "369", "dtype": "float32"}, {"name": "370", "dtype": "float32"}, {"name": "371", "dtype": "float32"}, {"name": "372", "dtype": "float32"}, {"name": "373", "dtype": "float32"}, {"name": "374", "dtype": "float32"}, {"name": "375", "dtype": "float32"}, {"name": "376", "dtype": "float32"}, {"name": "377", "dtype": "float32"}, {"name": "378", "dtype": "float32"}, {"name": "379", "dtype": "float32"}, {"name": "380", "dtype": "float32"}, {"name": "381", "dtype": "float32"}, {"name": 
"382", "dtype": "float32"}, {"name": "383", "dtype": "float32"}, {"name": "384", "dtype": "float32"}, {"name": "385", "dtype": "float32"}, {"name": "386", "dtype": "float32"}, {"name": "387", "dtype": "float32"}, {"name": "388", "dtype": "float32"}, {"name": "389", "dtype": "float32"}, {"name": "390", "dtype": "float32"}, {"name": "391", "dtype": "float32"}, {"name": "392", "dtype": "float32"}, {"name": "393", "dtype": "float32"}, {"name": "394", "dtype": "float32"}, {"name": "395", "dtype": "float32"}, {"name": "396", "dtype": "float32"}, {"name": "397", "dtype": "float32"}, {"name": "398", "dtype": "float32"}, {"name": "399", "dtype": "float32"}, {"name": "400", "dtype": "float32"}, {"name": "401", "dtype": "float32"}, {"name": "402", "dtype": "float32"}, {"name": "403", "dtype": "float32"}, {"name": "404", "dtype": "float32"}, {"name": "405", "dtype": "float32"}, {"name": "406", "dtype": "float32"}, {"name": "407", "dtype": "float32"}, {"name": "408", "dtype": "float32"}, {"name": "409", "dtype": "float32"}, {"name": "410", "dtype": "float32"}, {"name": "411", "dtype": "float32"}, {"name": "412", "dtype": "float32"}, {"name": "413", "dtype": "float32"}, {"name": "414", "dtype": "float32"}, {"name": "415", "dtype": "float32"}, {"name": "416", "dtype": "float32"}, {"name": "417", "dtype": "float32"}, {"name": "418", "dtype": "float32"}, {"name": "419", "dtype": "float32"}, {"name": "420", "dtype": "float32"}, {"name": "421", "dtype": "float32"}, {"name": "422", "dtype": "float32"}, {"name": "423", "dtype": "float32"}, {"name": "424", "dtype": "float32"}, {"name": "425", "dtype": "float32"}, {"name": "426", "dtype": "float32"}, {"name": "427", "dtype": "float32"}, {"name": "428", "dtype": "float32"}, {"name": "429", "dtype": "float32"}, {"name": "430", "dtype": "float32"}, {"name": "431", "dtype": "float32"}, {"name": "432", "dtype": "float32"}, {"name": "433", "dtype": "float32"}, {"name": "434", "dtype": "float32"}, {"name": "435", "dtype": "float32"}, {"name": "436", "dtype": "float32"}, {"name": "437", "dtype": "float32"}, {"name": "438", "dtype": "float32"}, {"name": "439", "dtype": "float32"}, {"name": "440", "dtype": "float32"}, {"name": "441", "dtype": "float32"}, {"name": "442", "dtype": "float32"}, {"name": "443", "dtype": "float32"}, {"name": "444", "dtype": "float32"}, {"name": "445", "dtype": "float32"}, {"name": "446", "dtype": "float32"}, {"name": "447", "dtype": "float32"}, {"name": "448", "dtype": "float32"}, {"name": "449", "dtype": "float32"}, {"name": "450", "dtype": "float32"}, {"name": "451", "dtype": "float32"}, {"name": "452", "dtype": "float32"}, {"name": "453", "dtype": "float32"}, {"name": "454", "dtype": "float32"}, {"name": "455", "dtype": "float32"}, {"name": "456", "dtype": "float32"}, {"name": "457", "dtype": "float32"}, {"name": "458", "dtype": "float32"}, {"name": "459", "dtype": "float32"}, {"name": "460", "dtype": "float32"}, {"name": "461", "dtype": "float32"}, {"name": "462", "dtype": "float32"}, {"name": "463", "dtype": "float32"}, {"name": "464", "dtype": "float32"}, {"name": "465", "dtype": "float32"}, {"name": "466", "dtype": "float32"}, {"name": "467", "dtype": "float32"}, {"name": "468", "dtype": "float32"}, {"name": "469", "dtype": "float32"}, {"name": "470", "dtype": "float32"}, {"name": "471", "dtype": "float32"}, {"name": "472", "dtype": "float32"}, {"name": "473", "dtype": "float32"}, {"name": "474", "dtype": "float32"}, {"name": "475", "dtype": "float32"}, {"name": "476", "dtype": "float32"}, {"name": "477", "dtype": "float32"}, {"name": 
"478", "dtype": "float32"}, {"name": "479", "dtype": "float32"}, {"name": "480", "dtype": "float32"}, {"name": "481", "dtype": "float32"}, {"name": "482", "dtype": "float32"}, {"name": "483", "dtype": "float32"}, {"name": "484", "dtype": "float32"}, {"name": "485", "dtype": "float32"}, {"name": "486", "dtype": "float32"}, {"name": "487", "dtype": "float32"}, {"name": "488", "dtype": "float32"}, {"name": "489", "dtype": "float32"}, {"name": "490", "dtype": "float32"}, {"name": "491", "dtype": "float32"}, {"name": "492", "dtype": "float32"}, {"name": "493", "dtype": "float32"}, {"name": "494", "dtype": "float32"}, {"name": "495", "dtype": "float32"}, {"name": "496", "dtype": "float32"}, {"name": "497", "dtype": "float32"}, {"name": "498", "dtype": "float32"}, {"name": "499", "dtype": "float32"}, {"name": "500", "dtype": "float32"}, {"name": "501", "dtype": "float32"}, {"name": "502", "dtype": "float32"}, {"name": "503", "dtype": "float32"}, {"name": "504", "dtype": "float32"}, {"name": "505", "dtype": "float32"}, {"name": "506", "dtype": "float32"}, {"name": "507", "dtype": "float32"}, {"name": "508", "dtype": "float32"}, {"name": "509", "dtype": "float32"}, {"name": "510", "dtype": "float32"}, {"name": "511", "dtype": "float32"}, {"name": "512", "dtype": "float32"}, {"name": "513", "dtype": "float32"}, {"name": "514", "dtype": "float32"}, {"name": "515", "dtype": "float32"}, {"name": "516", "dtype": "float32"}, {"name": "517", "dtype": "float32"}, {"name": "518", "dtype": "float32"}, {"name": "519", "dtype": "float32"}, {"name": "520", "dtype": "float32"}, {"name": "521", "dtype": "float32"}, {"name": "522", "dtype": "float32"}, {"name": "523", "dtype": "float32"}, {"name": "524", "dtype": "float32"}, {"name": "525", "dtype": "float32"}, {"name": "526", "dtype": "float32"}, {"name": "527", "dtype": "float32"}, {"name": "528", "dtype": "float32"}, {"name": "529", "dtype": "float32"}, {"name": "530", "dtype": "float32"}, {"name": "531", "dtype": "float32"}, {"name": "532", "dtype": "float32"}, {"name": "533", "dtype": "float32"}, {"name": "534", "dtype": "float32"}, {"name": "535", "dtype": "float32"}, {"name": "536", "dtype": "float32"}, {"name": "537", "dtype": "float32"}, {"name": "538", "dtype": "float32"}, {"name": "539", "dtype": "float32"}, {"name": "540", "dtype": "float32"}, {"name": "541", "dtype": "float32"}, {"name": "542", "dtype": "float32"}, {"name": "543", "dtype": "float32"}, {"name": "544", "dtype": "float32"}, {"name": "545", "dtype": "float32"}, {"name": "546", "dtype": "float32"}, {"name": "547", "dtype": "float32"}, {"name": "548", "dtype": "float32"}, {"name": "549", "dtype": "float32"}, {"name": "550", "dtype": "float32"}, {"name": "551", "dtype": "float32"}, {"name": "552", "dtype": "float32"}, {"name": "553", "dtype": "float32"}, {"name": "554", "dtype": "float32"}, {"name": "555", "dtype": "float32"}, {"name": "556", "dtype": "float32"}, {"name": "557", "dtype": "float32"}, {"name": "558", "dtype": "float32"}, {"name": "559", "dtype": "float32"}, {"name": "560", "dtype": "float32"}, {"name": "561", "dtype": "float32"}, {"name": "562", "dtype": "float32"}, {"name": "563", "dtype": "float32"}, {"name": "564", "dtype": "float32"}, {"name": "565", "dtype": "float32"}, {"name": "566", "dtype": "float32"}, {"name": "567", "dtype": "float32"}, {"name": "568", "dtype": "float32"}, {"name": "569", "dtype": "float32"}, {"name": "570", "dtype": "float32"}, {"name": "571", "dtype": "float32"}, {"name": "572", "dtype": "float32"}, {"name": "573", "dtype": "float32"}, {"name": 
"574", "dtype": "float32"}, {"name": "575", "dtype": "float32"}, {"name": "576", "dtype": "float32"}, {"name": "577", "dtype": "float32"}, {"name": "578", "dtype": "float32"}, {"name": "579", "dtype": "float32"}, {"name": "580", "dtype": "float32"}, {"name": "581", "dtype": "float32"}, {"name": "582", "dtype": "float32"}, {"name": "583", "dtype": "float32"}, {"name": "584", "dtype": "float32"}, {"name": "585", "dtype": "float32"}, {"name": "586", "dtype": "float32"}, {"name": "587", "dtype": "float32"}, {"name": "588", "dtype": "float32"}, {"name": "589", "dtype": "float32"}, {"name": "590", "dtype": "float32"}, {"name": "591", "dtype": "float32"}, {"name": "592", "dtype": "float32"}, {"name": "593", "dtype": "float32"}, {"name": "594", "dtype": "float32"}, {"name": "595", "dtype": "float32"}, {"name": "596", "dtype": "float32"}, {"name": "597", "dtype": "float32"}, {"name": "598", "dtype": "float32"}, {"name": "599", "dtype": "float32"}, {"name": "600", "dtype": "float32"}, {"name": "601", "dtype": "float32"}, {"name": "602", "dtype": "float32"}, {"name": "603", "dtype": "float32"}, {"name": "604", "dtype": "float32"}, {"name": "605", "dtype": "float32"}, {"name": "606", "dtype": "float32"}, {"name": "607", "dtype": "float32"}, {"name": "608", "dtype": "float32"}, {"name": "609", "dtype": "float32"}, {"name": "610", "dtype": "float32"}, {"name": "611", "dtype": "float32"}, {"name": "612", "dtype": "float32"}, {"name": "613", "dtype": "float32"}, {"name": "614", "dtype": "float32"}, {"name": "615", "dtype": "float32"}, {"name": "616", "dtype": "float32"}, {"name": "617", "dtype": "float32"}, {"name": "618", "dtype": "float32"}, {"name": "619", "dtype": "float32"}, {"name": "620", "dtype": "float32"}, {"name": "621", "dtype": "float32"}, {"name": "622", "dtype": "float32"}, {"name": "623", "dtype": "float32"}, {"name": "624", "dtype": "float32"}, {"name": "625", "dtype": "float32"}, {"name": "626", "dtype": "float32"}, {"name": "627", "dtype": "float32"}, {"name": "628", "dtype": "float32"}, {"name": "629", "dtype": "float32"}, {"name": "630", "dtype": "float32"}, {"name": "631", "dtype": "float32"}, {"name": "632", "dtype": "float32"}, {"name": "633", "dtype": "float32"}, {"name": "634", "dtype": "float32"}, {"name": "635", "dtype": "float32"}, {"name": "636", "dtype": "float32"}, {"name": "637", "dtype": "float32"}, {"name": "638", "dtype": "float32"}, {"name": "639", "dtype": "float32"}, {"name": "640", "dtype": "float32"}, {"name": "641", "dtype": "float32"}, {"name": "642", "dtype": "float32"}, {"name": "643", "dtype": "float32"}, {"name": "644", "dtype": "float32"}, {"name": "645", "dtype": "float32"}, {"name": "646", "dtype": "float32"}, {"name": "647", "dtype": "float32"}, {"name": "648", "dtype": "float32"}, {"name": "649", "dtype": "float32"}, {"name": "650", "dtype": "float32"}, {"name": "651", "dtype": "float32"}, {"name": "652", "dtype": "float32"}, {"name": "653", "dtype": "float32"}, {"name": "654", "dtype": "float32"}, {"name": "655", "dtype": "float32"}, {"name": "656", "dtype": "float32"}, {"name": "657", "dtype": "float32"}, {"name": "658", "dtype": "float32"}, {"name": "659", "dtype": "float32"}, {"name": "660", "dtype": "float32"}, {"name": "661", "dtype": "float32"}, {"name": "662", "dtype": "float32"}, {"name": "663", "dtype": "float32"}, {"name": "664", "dtype": "float32"}, {"name": "665", "dtype": "float32"}, {"name": "666", "dtype": "float32"}, {"name": "667", "dtype": "float32"}, {"name": "668", "dtype": "float32"}, {"name": "669", "dtype": "float32"}, {"name": 
"670", "dtype": "float32"}, {"name": "671", "dtype": "float32"}, {"name": "672", "dtype": "float32"}, {"name": "673", "dtype": "float32"}, {"name": "674", "dtype": "float32"}, {"name": "675", "dtype": "float32"}, {"name": "676", "dtype": "float32"}, {"name": "677", "dtype": "float32"}, {"name": "678", "dtype": "float32"}, {"name": "679", "dtype": "float32"}, {"name": "680", "dtype": "float32"}, {"name": "681", "dtype": "float32"}, {"name": "682", "dtype": "float32"}, {"name": "683", "dtype": "float32"}, {"name": "684", "dtype": "float32"}, {"name": "685", "dtype": "float32"}, {"name": "686", "dtype": "float32"}, {"name": "687", "dtype": "float32"}, {"name": "688", "dtype": "float32"}, {"name": "689", "dtype": "float32"}, {"name": "690", "dtype": "float32"}, {"name": "691", "dtype": "float32"}, {"name": "692", "dtype": "float32"}, {"name": "693", "dtype": "float32"}, {"name": "694", "dtype": "float32"}, {"name": "695", "dtype": "float32"}, {"name": "696", "dtype": "float32"}, {"name": "697", "dtype": "float32"}, {"name": "698", "dtype": "float32"}, {"name": "699", "dtype": "float32"}, {"name": "700", "dtype": "float32"}, {"name": "701", "dtype": "float32"}, {"name": "702", "dtype": "float32"}, {"name": "703", "dtype": "float32"}, {"name": "704", "dtype": "float32"}, {"name": "705", "dtype": "float32"}, {"name": "706", "dtype": "float32"}, {"name": "707", "dtype": "float32"}, {"name": "708", "dtype": "float32"}, {"name": "709", "dtype": "float32"}, {"name": "710", "dtype": "float32"}, {"name": "711", "dtype": "float32"}, {"name": "712", "dtype": "float32"}, {"name": "713", "dtype": "float32"}, {"name": "714", "dtype": "float32"}, {"name": "715", "dtype": "float32"}, {"name": "716", "dtype": "float32"}, {"name": "717", "dtype": "float32"}, {"name": "718", "dtype": "float32"}, {"name": "719", "dtype": "float32"}, {"name": "720", "dtype": "float32"}, {"name": "721", "dtype": "float32"}, {"name": "722", "dtype": "float32"}, {"name": "723", "dtype": "float32"}, {"name": "724", "dtype": "float32"}, {"name": "725", "dtype": "float32"}, {"name": "726", "dtype": "float32"}, {"name": "727", "dtype": "float32"}, {"name": "728", "dtype": "float32"}, {"name": "729", "dtype": "float32"}, {"name": "730", "dtype": "float32"}, {"name": "731", "dtype": "float32"}, {"name": "732", "dtype": "float32"}, {"name": "733", "dtype": "float32"}, {"name": "734", "dtype": "float32"}, {"name": "735", "dtype": "float32"}, {"name": "736", "dtype": "float32"}, {"name": "737", "dtype": "float32"}, {"name": "738", "dtype": "float32"}, {"name": "739", "dtype": "float32"}, {"name": "740", "dtype": "float32"}, {"name": "741", "dtype": "float32"}, {"name": "742", "dtype": "float32"}, {"name": "743", "dtype": "float32"}, {"name": "744", "dtype": "float32"}, {"name": "745", "dtype": "float32"}, {"name": "746", "dtype": "float32"}, {"name": "747", "dtype": "float32"}, {"name": "748", "dtype": "float32"}, {"name": "749", "dtype": "float32"}, {"name": "750", "dtype": "float32"}, {"name": "751", "dtype": "float32"}, {"name": "752", "dtype": "float32"}, {"name": "753", "dtype": "float32"}, {"name": "754", "dtype": "float32"}, {"name": "755", "dtype": "float32"}, {"name": "756", "dtype": "float32"}, {"name": "757", "dtype": "float32"}, {"name": "758", "dtype": "float32"}, {"name": "759", "dtype": "float32"}, {"name": "760", "dtype": "float32"}, {"name": "761", "dtype": "float32"}, {"name": "762", "dtype": "float32"}, {"name": "763", "dtype": "float32"}, {"name": "764", "dtype": "float32"}, {"name": "765", "dtype": "float32"}, {"name": 
"766", "dtype": "float32"}, {"name": "767", "dtype": "float32"}, {"name": "label", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 115608907.5, "num_examples": 37500}, {"name": "test", "num_bytes": 38536305.0, "num_examples": 12500}], "download_size": 211880373, "dataset_size": 154145212.5}}
|
2023-08-23T03:53:51+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "PKDD_BERT_Finetuned"
More Information needed
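Until the card is filled in, a minimal loading sketch may help. The repo id below is an assumption inferred from the sibling EgilKarlsen/PKDD_* cards, and the split sizes come from the metadata above:

```python
from datasets import load_dataset

# repo id is assumed; the sibling cards use the EgilKarlsen/ namespace
ds = load_dataset("EgilKarlsen/PKDD_BERT_Finetuned")
print(ds["train"].num_rows, ds["test"].num_rows)  # 37500 / 12500 per the metadata

row = ds["train"][0]
print(row["label"], row["0"], row["767"])  # string label plus 768 float32 features
```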
|
[
"# Dataset Card for \"PKDD_BERT_Finetuned\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"PKDD_BERT_Finetuned\"\n\nMore Information needed"
] |
[
6,
20
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"PKDD_BERT_Finetuned\"\n\nMore Information needed"
] |
8c368916fd38272cd00b1a4de1f54f96cbd418d3
|
# Dataset of koakuma/小悪魔/소악마 (Touhou)
This is the dataset of koakuma/小悪魔/소악마 (Touhou), containing 500 images and their tags.
The core tags of this character are `head_wings, wings, red_hair, long_hair, bat_wings, red_eyes, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 502.17 MiB | [Download](https://huggingface.co/datasets/CyberHarem/koakuma_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 336.81 MiB | [Download](https://huggingface.co/datasets/CyberHarem/koakuma_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1098 | 665.73 MiB | [Download](https://huggingface.co/datasets/CyberHarem/koakuma_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 462.83 MiB | [Download](https://huggingface.co/datasets/CyberHarem/koakuma_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1098 | 861.34 MiB | [Download](https://huggingface.co/datasets/CyberHarem/koakuma_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download the raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/koakuma_touhou',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract the files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
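The IMG+TXT packages listed above pair each image with a same-named `.txt` tag file. Below is a minimal sketch of pulling the 800px package and reading the tags; the flat filename layout inside the zip is an assumption, so adjust the walk if the archive uses subdirectories:

```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# download the 800px IMG+TXT package instead of the raw archive
zip_file = hf_hub_download(
    repo_id='CyberHarem/koakuma_touhou',
    repo_type='dataset',
    filename='dataset-800.zip',
)
out_dir = 'dataset_800'
os.makedirs(out_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(out_dir)

# each image is assumed to ship with a same-named .txt file holding its tags
for name in sorted(os.listdir(out_dir)):
    if name.endswith('.txt'):
        with open(os.path.join(out_dir, name), encoding='utf-8') as f:
            print(name, '->', f.read().strip())
```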
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 7 |  |  |  |  |  | 1girl, book, shirt, simple_background, solo, long_sleeves, red_necktie, vest, white_background, looking_at_viewer, skirt_set, black_thighhighs, open_mouth, :d, zettai_ryouiki |
| 1 | 32 |  |  |  |  |  | 1girl, red_necktie, solo, white_shirt, black_vest, collared_shirt, looking_at_viewer, simple_background, bangs, hair_between_eyes, blush, smile, white_background, black_skirt, closed_mouth, juliet_sleeves, upper_body, very_long_hair, cowboy_shot, open_mouth, pointy_ears |
| 2 | 5 |  |  |  |  |  | 1girl, solo, blush, book, red_necktie, one_eye_closed |
| 3 | 11 |  |  |  |  |  | 1girl, book, necktie, solo, black_thighhighs, blush, zettai_ryouiki, demon_tail |
| 4 | 5 |  |  |  |  |  | 1girl, blush, large_breasts, solo, navel, nipples, black_panties, black_thighhighs, demon_tail, underwear_only, bow_panties, bra, lingerie, looking_at_viewer, lying, medium_breasts |
| 5 | 24 |  |  |  |  |  | 1girl, large_breasts, solo, looking_at_viewer, pointy_ears, smile, blush, marker_(medium), very_long_hair, uneven_eyes, curvy, simple_background, white_background, millipen_(medium), navel, cleavage, swimsuit, convenient_censoring, nude |
| 6 | 5 |  |  |  |  |  | 1boy, 1girl, blush, hetero, large_breasts, nipples, open_mouth, sex, solo_focus, vaginal, cowgirl_position, girl_on_top, penis, censored, assertive_female, completely_nude, cum_in_pussy, demon_wings, looking_at_viewer, navel, pink_hair, pointy_ears, pov, saliva, smile, sweat |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | book | shirt | simple_background | solo | long_sleeves | red_necktie | vest | white_background | looking_at_viewer | skirt_set | black_thighhighs | open_mouth | :d | zettai_ryouiki | white_shirt | black_vest | collared_shirt | bangs | hair_between_eyes | blush | smile | black_skirt | closed_mouth | juliet_sleeves | upper_body | very_long_hair | cowboy_shot | pointy_ears | one_eye_closed | necktie | demon_tail | large_breasts | navel | nipples | black_panties | underwear_only | bow_panties | bra | lingerie | lying | medium_breasts | marker_(medium) | uneven_eyes | curvy | millipen_(medium) | cleavage | swimsuit | convenient_censoring | nude | 1boy | hetero | sex | solo_focus | vaginal | cowgirl_position | girl_on_top | penis | censored | assertive_female | completely_nude | cum_in_pussy | demon_wings | pink_hair | pov | saliva | sweat |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------|:--------------------|:-------|:---------------|:--------------|:-------|:-------------------|:--------------------|:------------|:-------------------|:-------------|:-----|:-----------------|:--------------|:-------------|:-----------------|:--------|:--------------------|:--------|:--------|:--------------|:---------------|:-----------------|:-------------|:-----------------|:--------------|:--------------|:-----------------|:----------|:-------------|:----------------|:--------|:----------|:----------------|:-----------------|:--------------|:------|:-----------|:--------|:-----------------|:------------------|:--------------|:--------|:--------------------|:-----------|:-----------|:-----------------------|:-------|:-------|:---------|:------|:-------------|:----------|:-------------------|:--------------|:--------|:-----------|:-------------------|:------------------|:---------------|:--------------|:------------|:------|:---------|:--------|
| 0 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 32 |  |  |  |  |  | X | | | X | X | | X | | X | X | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | X | | | X | | X | | | | | | | | | | | | | | X | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 11 |  |  |  |  |  | X | X | | | X | | | | | | | X | | | X | | | | | | X | | | | | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | | | | X | | | | | X | | X | | | | | | | | | X | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 24 |  |  |  |  |  | X | | | X | X | | | | X | X | | | | | | | | | | | X | X | | | | | X | | X | | | | X | X | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | |
| 6 | 5 |  |  |  |  |  | X | | | | | | | | | X | | | X | | | | | | | | X | X | | | | | | | X | | | | X | X | X | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
CyberHarem/koakuma_touhou
|
[
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] |
2023-08-18T00:54:28+00:00
|
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
|
2024-01-14T12:22:17+00:00
|
[] |
[] |
TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
|
Dataset of koakuma/小悪魔/소악마 (Touhou)
===================================
This is the dataset of koakuma/小悪魔/소악마 (Touhou), containing 500 images and their tags.
The core tags of this character are 'head\_wings, wings, red\_hair, long\_hair, bat\_wings, red\_eyes, breasts', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the DeepGHS Team (huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for waifuc loading. If you need it, run the following code.
List of Clusters
----------------
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
### Table Version
|
[
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
44,
61,
5,
4
] |
[
"passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version"
] |
ccd9a18a2ac2bb0b29bc732a642935a665e6cd1d
|
# Dataset Card for "PKDD_RoBERTa_Finetuned"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
EgilKarlsen/PKDD_RoBERTa_Finetuned
|
[
"region:us"
] |
2023-08-18T01:04:55+00:00
|
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "0", "dtype": "float32"}, {"name": "1", "dtype": "float32"}, {"name": "2", "dtype": "float32"}, {"name": "3", "dtype": "float32"}, {"name": "4", "dtype": "float32"}, {"name": "5", "dtype": "float32"}, {"name": "6", "dtype": "float32"}, {"name": "7", "dtype": "float32"}, {"name": "8", "dtype": "float32"}, {"name": "9", "dtype": "float32"}, {"name": "10", "dtype": "float32"}, {"name": "11", "dtype": "float32"}, {"name": "12", "dtype": "float32"}, {"name": "13", "dtype": "float32"}, {"name": "14", "dtype": "float32"}, {"name": "15", "dtype": "float32"}, {"name": "16", "dtype": "float32"}, {"name": "17", "dtype": "float32"}, {"name": "18", "dtype": "float32"}, {"name": "19", "dtype": "float32"}, {"name": "20", "dtype": "float32"}, {"name": "21", "dtype": "float32"}, {"name": "22", "dtype": "float32"}, {"name": "23", "dtype": "float32"}, {"name": "24", "dtype": "float32"}, {"name": "25", "dtype": "float32"}, {"name": "26", "dtype": "float32"}, {"name": "27", "dtype": "float32"}, {"name": "28", "dtype": "float32"}, {"name": "29", "dtype": "float32"}, {"name": "30", "dtype": "float32"}, {"name": "31", "dtype": "float32"}, {"name": "32", "dtype": "float32"}, {"name": "33", "dtype": "float32"}, {"name": "34", "dtype": "float32"}, {"name": "35", "dtype": "float32"}, {"name": "36", "dtype": "float32"}, {"name": "37", "dtype": "float32"}, {"name": "38", "dtype": "float32"}, {"name": "39", "dtype": "float32"}, {"name": "40", "dtype": "float32"}, {"name": "41", "dtype": "float32"}, {"name": "42", "dtype": "float32"}, {"name": "43", "dtype": "float32"}, {"name": "44", "dtype": "float32"}, {"name": "45", "dtype": "float32"}, {"name": "46", "dtype": "float32"}, {"name": "47", "dtype": "float32"}, {"name": "48", "dtype": "float32"}, {"name": "49", "dtype": "float32"}, {"name": "50", "dtype": "float32"}, {"name": "51", "dtype": "float32"}, {"name": "52", "dtype": "float32"}, {"name": "53", "dtype": "float32"}, {"name": "54", "dtype": "float32"}, {"name": "55", "dtype": "float32"}, {"name": "56", "dtype": "float32"}, {"name": "57", "dtype": "float32"}, {"name": "58", "dtype": "float32"}, {"name": "59", "dtype": "float32"}, {"name": "60", "dtype": "float32"}, {"name": "61", "dtype": "float32"}, {"name": "62", "dtype": "float32"}, {"name": "63", "dtype": "float32"}, {"name": "64", "dtype": "float32"}, {"name": "65", "dtype": "float32"}, {"name": "66", "dtype": "float32"}, {"name": "67", "dtype": "float32"}, {"name": "68", "dtype": "float32"}, {"name": "69", "dtype": "float32"}, {"name": "70", "dtype": "float32"}, {"name": "71", "dtype": "float32"}, {"name": "72", "dtype": "float32"}, {"name": "73", "dtype": "float32"}, {"name": "74", "dtype": "float32"}, {"name": "75", "dtype": "float32"}, {"name": "76", "dtype": "float32"}, {"name": "77", "dtype": "float32"}, {"name": "78", "dtype": "float32"}, {"name": "79", "dtype": "float32"}, {"name": "80", "dtype": "float32"}, {"name": "81", "dtype": "float32"}, {"name": "82", "dtype": "float32"}, {"name": "83", "dtype": "float32"}, {"name": "84", "dtype": "float32"}, {"name": "85", "dtype": "float32"}, {"name": "86", "dtype": "float32"}, {"name": "87", "dtype": "float32"}, {"name": "88", "dtype": "float32"}, {"name": "89", "dtype": "float32"}, {"name": "90", "dtype": "float32"}, {"name": "91", "dtype": "float32"}, {"name": "92", "dtype": "float32"}, {"name": "93", "dtype": "float32"}, 
{"name": "94", "dtype": "float32"}, {"name": "95", "dtype": "float32"}, {"name": "96", "dtype": "float32"}, {"name": "97", "dtype": "float32"}, {"name": "98", "dtype": "float32"}, {"name": "99", "dtype": "float32"}, {"name": "100", "dtype": "float32"}, {"name": "101", "dtype": "float32"}, {"name": "102", "dtype": "float32"}, {"name": "103", "dtype": "float32"}, {"name": "104", "dtype": "float32"}, {"name": "105", "dtype": "float32"}, {"name": "106", "dtype": "float32"}, {"name": "107", "dtype": "float32"}, {"name": "108", "dtype": "float32"}, {"name": "109", "dtype": "float32"}, {"name": "110", "dtype": "float32"}, {"name": "111", "dtype": "float32"}, {"name": "112", "dtype": "float32"}, {"name": "113", "dtype": "float32"}, {"name": "114", "dtype": "float32"}, {"name": "115", "dtype": "float32"}, {"name": "116", "dtype": "float32"}, {"name": "117", "dtype": "float32"}, {"name": "118", "dtype": "float32"}, {"name": "119", "dtype": "float32"}, {"name": "120", "dtype": "float32"}, {"name": "121", "dtype": "float32"}, {"name": "122", "dtype": "float32"}, {"name": "123", "dtype": "float32"}, {"name": "124", "dtype": "float32"}, {"name": "125", "dtype": "float32"}, {"name": "126", "dtype": "float32"}, {"name": "127", "dtype": "float32"}, {"name": "128", "dtype": "float32"}, {"name": "129", "dtype": "float32"}, {"name": "130", "dtype": "float32"}, {"name": "131", "dtype": "float32"}, {"name": "132", "dtype": "float32"}, {"name": "133", "dtype": "float32"}, {"name": "134", "dtype": "float32"}, {"name": "135", "dtype": "float32"}, {"name": "136", "dtype": "float32"}, {"name": "137", "dtype": "float32"}, {"name": "138", "dtype": "float32"}, {"name": "139", "dtype": "float32"}, {"name": "140", "dtype": "float32"}, {"name": "141", "dtype": "float32"}, {"name": "142", "dtype": "float32"}, {"name": "143", "dtype": "float32"}, {"name": "144", "dtype": "float32"}, {"name": "145", "dtype": "float32"}, {"name": "146", "dtype": "float32"}, {"name": "147", "dtype": "float32"}, {"name": "148", "dtype": "float32"}, {"name": "149", "dtype": "float32"}, {"name": "150", "dtype": "float32"}, {"name": "151", "dtype": "float32"}, {"name": "152", "dtype": "float32"}, {"name": "153", "dtype": "float32"}, {"name": "154", "dtype": "float32"}, {"name": "155", "dtype": "float32"}, {"name": "156", "dtype": "float32"}, {"name": "157", "dtype": "float32"}, {"name": "158", "dtype": "float32"}, {"name": "159", "dtype": "float32"}, {"name": "160", "dtype": "float32"}, {"name": "161", "dtype": "float32"}, {"name": "162", "dtype": "float32"}, {"name": "163", "dtype": "float32"}, {"name": "164", "dtype": "float32"}, {"name": "165", "dtype": "float32"}, {"name": "166", "dtype": "float32"}, {"name": "167", "dtype": "float32"}, {"name": "168", "dtype": "float32"}, {"name": "169", "dtype": "float32"}, {"name": "170", "dtype": "float32"}, {"name": "171", "dtype": "float32"}, {"name": "172", "dtype": "float32"}, {"name": "173", "dtype": "float32"}, {"name": "174", "dtype": "float32"}, {"name": "175", "dtype": "float32"}, {"name": "176", "dtype": "float32"}, {"name": "177", "dtype": "float32"}, {"name": "178", "dtype": "float32"}, {"name": "179", "dtype": "float32"}, {"name": "180", "dtype": "float32"}, {"name": "181", "dtype": "float32"}, {"name": "182", "dtype": "float32"}, {"name": "183", "dtype": "float32"}, {"name": "184", "dtype": "float32"}, {"name": "185", "dtype": "float32"}, {"name": "186", "dtype": "float32"}, {"name": "187", "dtype": "float32"}, {"name": "188", "dtype": "float32"}, {"name": "189", "dtype": "float32"}, {"name": 
"190", "dtype": "float32"}, {"name": "191", "dtype": "float32"}, {"name": "192", "dtype": "float32"}, {"name": "193", "dtype": "float32"}, {"name": "194", "dtype": "float32"}, {"name": "195", "dtype": "float32"}, {"name": "196", "dtype": "float32"}, {"name": "197", "dtype": "float32"}, {"name": "198", "dtype": "float32"}, {"name": "199", "dtype": "float32"}, {"name": "200", "dtype": "float32"}, {"name": "201", "dtype": "float32"}, {"name": "202", "dtype": "float32"}, {"name": "203", "dtype": "float32"}, {"name": "204", "dtype": "float32"}, {"name": "205", "dtype": "float32"}, {"name": "206", "dtype": "float32"}, {"name": "207", "dtype": "float32"}, {"name": "208", "dtype": "float32"}, {"name": "209", "dtype": "float32"}, {"name": "210", "dtype": "float32"}, {"name": "211", "dtype": "float32"}, {"name": "212", "dtype": "float32"}, {"name": "213", "dtype": "float32"}, {"name": "214", "dtype": "float32"}, {"name": "215", "dtype": "float32"}, {"name": "216", "dtype": "float32"}, {"name": "217", "dtype": "float32"}, {"name": "218", "dtype": "float32"}, {"name": "219", "dtype": "float32"}, {"name": "220", "dtype": "float32"}, {"name": "221", "dtype": "float32"}, {"name": "222", "dtype": "float32"}, {"name": "223", "dtype": "float32"}, {"name": "224", "dtype": "float32"}, {"name": "225", "dtype": "float32"}, {"name": "226", "dtype": "float32"}, {"name": "227", "dtype": "float32"}, {"name": "228", "dtype": "float32"}, {"name": "229", "dtype": "float32"}, {"name": "230", "dtype": "float32"}, {"name": "231", "dtype": "float32"}, {"name": "232", "dtype": "float32"}, {"name": "233", "dtype": "float32"}, {"name": "234", "dtype": "float32"}, {"name": "235", "dtype": "float32"}, {"name": "236", "dtype": "float32"}, {"name": "237", "dtype": "float32"}, {"name": "238", "dtype": "float32"}, {"name": "239", "dtype": "float32"}, {"name": "240", "dtype": "float32"}, {"name": "241", "dtype": "float32"}, {"name": "242", "dtype": "float32"}, {"name": "243", "dtype": "float32"}, {"name": "244", "dtype": "float32"}, {"name": "245", "dtype": "float32"}, {"name": "246", "dtype": "float32"}, {"name": "247", "dtype": "float32"}, {"name": "248", "dtype": "float32"}, {"name": "249", "dtype": "float32"}, {"name": "250", "dtype": "float32"}, {"name": "251", "dtype": "float32"}, {"name": "252", "dtype": "float32"}, {"name": "253", "dtype": "float32"}, {"name": "254", "dtype": "float32"}, {"name": "255", "dtype": "float32"}, {"name": "256", "dtype": "float32"}, {"name": "257", "dtype": "float32"}, {"name": "258", "dtype": "float32"}, {"name": "259", "dtype": "float32"}, {"name": "260", "dtype": "float32"}, {"name": "261", "dtype": "float32"}, {"name": "262", "dtype": "float32"}, {"name": "263", "dtype": "float32"}, {"name": "264", "dtype": "float32"}, {"name": "265", "dtype": "float32"}, {"name": "266", "dtype": "float32"}, {"name": "267", "dtype": "float32"}, {"name": "268", "dtype": "float32"}, {"name": "269", "dtype": "float32"}, {"name": "270", "dtype": "float32"}, {"name": "271", "dtype": "float32"}, {"name": "272", "dtype": "float32"}, {"name": "273", "dtype": "float32"}, {"name": "274", "dtype": "float32"}, {"name": "275", "dtype": "float32"}, {"name": "276", "dtype": "float32"}, {"name": "277", "dtype": "float32"}, {"name": "278", "dtype": "float32"}, {"name": "279", "dtype": "float32"}, {"name": "280", "dtype": "float32"}, {"name": "281", "dtype": "float32"}, {"name": "282", "dtype": "float32"}, {"name": "283", "dtype": "float32"}, {"name": "284", "dtype": "float32"}, {"name": "285", "dtype": "float32"}, {"name": 
"286", "dtype": "float32"}, {"name": "287", "dtype": "float32"}, {"name": "288", "dtype": "float32"}, {"name": "289", "dtype": "float32"}, {"name": "290", "dtype": "float32"}, {"name": "291", "dtype": "float32"}, {"name": "292", "dtype": "float32"}, {"name": "293", "dtype": "float32"}, {"name": "294", "dtype": "float32"}, {"name": "295", "dtype": "float32"}, {"name": "296", "dtype": "float32"}, {"name": "297", "dtype": "float32"}, {"name": "298", "dtype": "float32"}, {"name": "299", "dtype": "float32"}, {"name": "300", "dtype": "float32"}, {"name": "301", "dtype": "float32"}, {"name": "302", "dtype": "float32"}, {"name": "303", "dtype": "float32"}, {"name": "304", "dtype": "float32"}, {"name": "305", "dtype": "float32"}, {"name": "306", "dtype": "float32"}, {"name": "307", "dtype": "float32"}, {"name": "308", "dtype": "float32"}, {"name": "309", "dtype": "float32"}, {"name": "310", "dtype": "float32"}, {"name": "311", "dtype": "float32"}, {"name": "312", "dtype": "float32"}, {"name": "313", "dtype": "float32"}, {"name": "314", "dtype": "float32"}, {"name": "315", "dtype": "float32"}, {"name": "316", "dtype": "float32"}, {"name": "317", "dtype": "float32"}, {"name": "318", "dtype": "float32"}, {"name": "319", "dtype": "float32"}, {"name": "320", "dtype": "float32"}, {"name": "321", "dtype": "float32"}, {"name": "322", "dtype": "float32"}, {"name": "323", "dtype": "float32"}, {"name": "324", "dtype": "float32"}, {"name": "325", "dtype": "float32"}, {"name": "326", "dtype": "float32"}, {"name": "327", "dtype": "float32"}, {"name": "328", "dtype": "float32"}, {"name": "329", "dtype": "float32"}, {"name": "330", "dtype": "float32"}, {"name": "331", "dtype": "float32"}, {"name": "332", "dtype": "float32"}, {"name": "333", "dtype": "float32"}, {"name": "334", "dtype": "float32"}, {"name": "335", "dtype": "float32"}, {"name": "336", "dtype": "float32"}, {"name": "337", "dtype": "float32"}, {"name": "338", "dtype": "float32"}, {"name": "339", "dtype": "float32"}, {"name": "340", "dtype": "float32"}, {"name": "341", "dtype": "float32"}, {"name": "342", "dtype": "float32"}, {"name": "343", "dtype": "float32"}, {"name": "344", "dtype": "float32"}, {"name": "345", "dtype": "float32"}, {"name": "346", "dtype": "float32"}, {"name": "347", "dtype": "float32"}, {"name": "348", "dtype": "float32"}, {"name": "349", "dtype": "float32"}, {"name": "350", "dtype": "float32"}, {"name": "351", "dtype": "float32"}, {"name": "352", "dtype": "float32"}, {"name": "353", "dtype": "float32"}, {"name": "354", "dtype": "float32"}, {"name": "355", "dtype": "float32"}, {"name": "356", "dtype": "float32"}, {"name": "357", "dtype": "float32"}, {"name": "358", "dtype": "float32"}, {"name": "359", "dtype": "float32"}, {"name": "360", "dtype": "float32"}, {"name": "361", "dtype": "float32"}, {"name": "362", "dtype": "float32"}, {"name": "363", "dtype": "float32"}, {"name": "364", "dtype": "float32"}, {"name": "365", "dtype": "float32"}, {"name": "366", "dtype": "float32"}, {"name": "367", "dtype": "float32"}, {"name": "368", "dtype": "float32"}, {"name": "369", "dtype": "float32"}, {"name": "370", "dtype": "float32"}, {"name": "371", "dtype": "float32"}, {"name": "372", "dtype": "float32"}, {"name": "373", "dtype": "float32"}, {"name": "374", "dtype": "float32"}, {"name": "375", "dtype": "float32"}, {"name": "376", "dtype": "float32"}, {"name": "377", "dtype": "float32"}, {"name": "378", "dtype": "float32"}, {"name": "379", "dtype": "float32"}, {"name": "380", "dtype": "float32"}, {"name": "381", "dtype": "float32"}, {"name": 
"382", "dtype": "float32"}, {"name": "383", "dtype": "float32"}, {"name": "384", "dtype": "float32"}, {"name": "385", "dtype": "float32"}, {"name": "386", "dtype": "float32"}, {"name": "387", "dtype": "float32"}, {"name": "388", "dtype": "float32"}, {"name": "389", "dtype": "float32"}, {"name": "390", "dtype": "float32"}, {"name": "391", "dtype": "float32"}, {"name": "392", "dtype": "float32"}, {"name": "393", "dtype": "float32"}, {"name": "394", "dtype": "float32"}, {"name": "395", "dtype": "float32"}, {"name": "396", "dtype": "float32"}, {"name": "397", "dtype": "float32"}, {"name": "398", "dtype": "float32"}, {"name": "399", "dtype": "float32"}, {"name": "400", "dtype": "float32"}, {"name": "401", "dtype": "float32"}, {"name": "402", "dtype": "float32"}, {"name": "403", "dtype": "float32"}, {"name": "404", "dtype": "float32"}, {"name": "405", "dtype": "float32"}, {"name": "406", "dtype": "float32"}, {"name": "407", "dtype": "float32"}, {"name": "408", "dtype": "float32"}, {"name": "409", "dtype": "float32"}, {"name": "410", "dtype": "float32"}, {"name": "411", "dtype": "float32"}, {"name": "412", "dtype": "float32"}, {"name": "413", "dtype": "float32"}, {"name": "414", "dtype": "float32"}, {"name": "415", "dtype": "float32"}, {"name": "416", "dtype": "float32"}, {"name": "417", "dtype": "float32"}, {"name": "418", "dtype": "float32"}, {"name": "419", "dtype": "float32"}, {"name": "420", "dtype": "float32"}, {"name": "421", "dtype": "float32"}, {"name": "422", "dtype": "float32"}, {"name": "423", "dtype": "float32"}, {"name": "424", "dtype": "float32"}, {"name": "425", "dtype": "float32"}, {"name": "426", "dtype": "float32"}, {"name": "427", "dtype": "float32"}, {"name": "428", "dtype": "float32"}, {"name": "429", "dtype": "float32"}, {"name": "430", "dtype": "float32"}, {"name": "431", "dtype": "float32"}, {"name": "432", "dtype": "float32"}, {"name": "433", "dtype": "float32"}, {"name": "434", "dtype": "float32"}, {"name": "435", "dtype": "float32"}, {"name": "436", "dtype": "float32"}, {"name": "437", "dtype": "float32"}, {"name": "438", "dtype": "float32"}, {"name": "439", "dtype": "float32"}, {"name": "440", "dtype": "float32"}, {"name": "441", "dtype": "float32"}, {"name": "442", "dtype": "float32"}, {"name": "443", "dtype": "float32"}, {"name": "444", "dtype": "float32"}, {"name": "445", "dtype": "float32"}, {"name": "446", "dtype": "float32"}, {"name": "447", "dtype": "float32"}, {"name": "448", "dtype": "float32"}, {"name": "449", "dtype": "float32"}, {"name": "450", "dtype": "float32"}, {"name": "451", "dtype": "float32"}, {"name": "452", "dtype": "float32"}, {"name": "453", "dtype": "float32"}, {"name": "454", "dtype": "float32"}, {"name": "455", "dtype": "float32"}, {"name": "456", "dtype": "float32"}, {"name": "457", "dtype": "float32"}, {"name": "458", "dtype": "float32"}, {"name": "459", "dtype": "float32"}, {"name": "460", "dtype": "float32"}, {"name": "461", "dtype": "float32"}, {"name": "462", "dtype": "float32"}, {"name": "463", "dtype": "float32"}, {"name": "464", "dtype": "float32"}, {"name": "465", "dtype": "float32"}, {"name": "466", "dtype": "float32"}, {"name": "467", "dtype": "float32"}, {"name": "468", "dtype": "float32"}, {"name": "469", "dtype": "float32"}, {"name": "470", "dtype": "float32"}, {"name": "471", "dtype": "float32"}, {"name": "472", "dtype": "float32"}, {"name": "473", "dtype": "float32"}, {"name": "474", "dtype": "float32"}, {"name": "475", "dtype": "float32"}, {"name": "476", "dtype": "float32"}, {"name": "477", "dtype": "float32"}, {"name": 
"478", "dtype": "float32"}, {"name": "479", "dtype": "float32"}, {"name": "480", "dtype": "float32"}, {"name": "481", "dtype": "float32"}, {"name": "482", "dtype": "float32"}, {"name": "483", "dtype": "float32"}, {"name": "484", "dtype": "float32"}, {"name": "485", "dtype": "float32"}, {"name": "486", "dtype": "float32"}, {"name": "487", "dtype": "float32"}, {"name": "488", "dtype": "float32"}, {"name": "489", "dtype": "float32"}, {"name": "490", "dtype": "float32"}, {"name": "491", "dtype": "float32"}, {"name": "492", "dtype": "float32"}, {"name": "493", "dtype": "float32"}, {"name": "494", "dtype": "float32"}, {"name": "495", "dtype": "float32"}, {"name": "496", "dtype": "float32"}, {"name": "497", "dtype": "float32"}, {"name": "498", "dtype": "float32"}, {"name": "499", "dtype": "float32"}, {"name": "500", "dtype": "float32"}, {"name": "501", "dtype": "float32"}, {"name": "502", "dtype": "float32"}, {"name": "503", "dtype": "float32"}, {"name": "504", "dtype": "float32"}, {"name": "505", "dtype": "float32"}, {"name": "506", "dtype": "float32"}, {"name": "507", "dtype": "float32"}, {"name": "508", "dtype": "float32"}, {"name": "509", "dtype": "float32"}, {"name": "510", "dtype": "float32"}, {"name": "511", "dtype": "float32"}, {"name": "512", "dtype": "float32"}, {"name": "513", "dtype": "float32"}, {"name": "514", "dtype": "float32"}, {"name": "515", "dtype": "float32"}, {"name": "516", "dtype": "float32"}, {"name": "517", "dtype": "float32"}, {"name": "518", "dtype": "float32"}, {"name": "519", "dtype": "float32"}, {"name": "520", "dtype": "float32"}, {"name": "521", "dtype": "float32"}, {"name": "522", "dtype": "float32"}, {"name": "523", "dtype": "float32"}, {"name": "524", "dtype": "float32"}, {"name": "525", "dtype": "float32"}, {"name": "526", "dtype": "float32"}, {"name": "527", "dtype": "float32"}, {"name": "528", "dtype": "float32"}, {"name": "529", "dtype": "float32"}, {"name": "530", "dtype": "float32"}, {"name": "531", "dtype": "float32"}, {"name": "532", "dtype": "float32"}, {"name": "533", "dtype": "float32"}, {"name": "534", "dtype": "float32"}, {"name": "535", "dtype": "float32"}, {"name": "536", "dtype": "float32"}, {"name": "537", "dtype": "float32"}, {"name": "538", "dtype": "float32"}, {"name": "539", "dtype": "float32"}, {"name": "540", "dtype": "float32"}, {"name": "541", "dtype": "float32"}, {"name": "542", "dtype": "float32"}, {"name": "543", "dtype": "float32"}, {"name": "544", "dtype": "float32"}, {"name": "545", "dtype": "float32"}, {"name": "546", "dtype": "float32"}, {"name": "547", "dtype": "float32"}, {"name": "548", "dtype": "float32"}, {"name": "549", "dtype": "float32"}, {"name": "550", "dtype": "float32"}, {"name": "551", "dtype": "float32"}, {"name": "552", "dtype": "float32"}, {"name": "553", "dtype": "float32"}, {"name": "554", "dtype": "float32"}, {"name": "555", "dtype": "float32"}, {"name": "556", "dtype": "float32"}, {"name": "557", "dtype": "float32"}, {"name": "558", "dtype": "float32"}, {"name": "559", "dtype": "float32"}, {"name": "560", "dtype": "float32"}, {"name": "561", "dtype": "float32"}, {"name": "562", "dtype": "float32"}, {"name": "563", "dtype": "float32"}, {"name": "564", "dtype": "float32"}, {"name": "565", "dtype": "float32"}, {"name": "566", "dtype": "float32"}, {"name": "567", "dtype": "float32"}, {"name": "568", "dtype": "float32"}, {"name": "569", "dtype": "float32"}, {"name": "570", "dtype": "float32"}, {"name": "571", "dtype": "float32"}, {"name": "572", "dtype": "float32"}, {"name": "573", "dtype": "float32"}, {"name": 
"574", "dtype": "float32"}, {"name": "575", "dtype": "float32"}, {"name": "576", "dtype": "float32"}, {"name": "577", "dtype": "float32"}, {"name": "578", "dtype": "float32"}, {"name": "579", "dtype": "float32"}, {"name": "580", "dtype": "float32"}, {"name": "581", "dtype": "float32"}, {"name": "582", "dtype": "float32"}, {"name": "583", "dtype": "float32"}, {"name": "584", "dtype": "float32"}, {"name": "585", "dtype": "float32"}, {"name": "586", "dtype": "float32"}, {"name": "587", "dtype": "float32"}, {"name": "588", "dtype": "float32"}, {"name": "589", "dtype": "float32"}, {"name": "590", "dtype": "float32"}, {"name": "591", "dtype": "float32"}, {"name": "592", "dtype": "float32"}, {"name": "593", "dtype": "float32"}, {"name": "594", "dtype": "float32"}, {"name": "595", "dtype": "float32"}, {"name": "596", "dtype": "float32"}, {"name": "597", "dtype": "float32"}, {"name": "598", "dtype": "float32"}, {"name": "599", "dtype": "float32"}, {"name": "600", "dtype": "float32"}, {"name": "601", "dtype": "float32"}, {"name": "602", "dtype": "float32"}, {"name": "603", "dtype": "float32"}, {"name": "604", "dtype": "float32"}, {"name": "605", "dtype": "float32"}, {"name": "606", "dtype": "float32"}, {"name": "607", "dtype": "float32"}, {"name": "608", "dtype": "float32"}, {"name": "609", "dtype": "float32"}, {"name": "610", "dtype": "float32"}, {"name": "611", "dtype": "float32"}, {"name": "612", "dtype": "float32"}, {"name": "613", "dtype": "float32"}, {"name": "614", "dtype": "float32"}, {"name": "615", "dtype": "float32"}, {"name": "616", "dtype": "float32"}, {"name": "617", "dtype": "float32"}, {"name": "618", "dtype": "float32"}, {"name": "619", "dtype": "float32"}, {"name": "620", "dtype": "float32"}, {"name": "621", "dtype": "float32"}, {"name": "622", "dtype": "float32"}, {"name": "623", "dtype": "float32"}, {"name": "624", "dtype": "float32"}, {"name": "625", "dtype": "float32"}, {"name": "626", "dtype": "float32"}, {"name": "627", "dtype": "float32"}, {"name": "628", "dtype": "float32"}, {"name": "629", "dtype": "float32"}, {"name": "630", "dtype": "float32"}, {"name": "631", "dtype": "float32"}, {"name": "632", "dtype": "float32"}, {"name": "633", "dtype": "float32"}, {"name": "634", "dtype": "float32"}, {"name": "635", "dtype": "float32"}, {"name": "636", "dtype": "float32"}, {"name": "637", "dtype": "float32"}, {"name": "638", "dtype": "float32"}, {"name": "639", "dtype": "float32"}, {"name": "640", "dtype": "float32"}, {"name": "641", "dtype": "float32"}, {"name": "642", "dtype": "float32"}, {"name": "643", "dtype": "float32"}, {"name": "644", "dtype": "float32"}, {"name": "645", "dtype": "float32"}, {"name": "646", "dtype": "float32"}, {"name": "647", "dtype": "float32"}, {"name": "648", "dtype": "float32"}, {"name": "649", "dtype": "float32"}, {"name": "650", "dtype": "float32"}, {"name": "651", "dtype": "float32"}, {"name": "652", "dtype": "float32"}, {"name": "653", "dtype": "float32"}, {"name": "654", "dtype": "float32"}, {"name": "655", "dtype": "float32"}, {"name": "656", "dtype": "float32"}, {"name": "657", "dtype": "float32"}, {"name": "658", "dtype": "float32"}, {"name": "659", "dtype": "float32"}, {"name": "660", "dtype": "float32"}, {"name": "661", "dtype": "float32"}, {"name": "662", "dtype": "float32"}, {"name": "663", "dtype": "float32"}, {"name": "664", "dtype": "float32"}, {"name": "665", "dtype": "float32"}, {"name": "666", "dtype": "float32"}, {"name": "667", "dtype": "float32"}, {"name": "668", "dtype": "float32"}, {"name": "669", "dtype": "float32"}, {"name": 
"670", "dtype": "float32"}, {"name": "671", "dtype": "float32"}, {"name": "672", "dtype": "float32"}, {"name": "673", "dtype": "float32"}, {"name": "674", "dtype": "float32"}, {"name": "675", "dtype": "float32"}, {"name": "676", "dtype": "float32"}, {"name": "677", "dtype": "float32"}, {"name": "678", "dtype": "float32"}, {"name": "679", "dtype": "float32"}, {"name": "680", "dtype": "float32"}, {"name": "681", "dtype": "float32"}, {"name": "682", "dtype": "float32"}, {"name": "683", "dtype": "float32"}, {"name": "684", "dtype": "float32"}, {"name": "685", "dtype": "float32"}, {"name": "686", "dtype": "float32"}, {"name": "687", "dtype": "float32"}, {"name": "688", "dtype": "float32"}, {"name": "689", "dtype": "float32"}, {"name": "690", "dtype": "float32"}, {"name": "691", "dtype": "float32"}, {"name": "692", "dtype": "float32"}, {"name": "693", "dtype": "float32"}, {"name": "694", "dtype": "float32"}, {"name": "695", "dtype": "float32"}, {"name": "696", "dtype": "float32"}, {"name": "697", "dtype": "float32"}, {"name": "698", "dtype": "float32"}, {"name": "699", "dtype": "float32"}, {"name": "700", "dtype": "float32"}, {"name": "701", "dtype": "float32"}, {"name": "702", "dtype": "float32"}, {"name": "703", "dtype": "float32"}, {"name": "704", "dtype": "float32"}, {"name": "705", "dtype": "float32"}, {"name": "706", "dtype": "float32"}, {"name": "707", "dtype": "float32"}, {"name": "708", "dtype": "float32"}, {"name": "709", "dtype": "float32"}, {"name": "710", "dtype": "float32"}, {"name": "711", "dtype": "float32"}, {"name": "712", "dtype": "float32"}, {"name": "713", "dtype": "float32"}, {"name": "714", "dtype": "float32"}, {"name": "715", "dtype": "float32"}, {"name": "716", "dtype": "float32"}, {"name": "717", "dtype": "float32"}, {"name": "718", "dtype": "float32"}, {"name": "719", "dtype": "float32"}, {"name": "720", "dtype": "float32"}, {"name": "721", "dtype": "float32"}, {"name": "722", "dtype": "float32"}, {"name": "723", "dtype": "float32"}, {"name": "724", "dtype": "float32"}, {"name": "725", "dtype": "float32"}, {"name": "726", "dtype": "float32"}, {"name": "727", "dtype": "float32"}, {"name": "728", "dtype": "float32"}, {"name": "729", "dtype": "float32"}, {"name": "730", "dtype": "float32"}, {"name": "731", "dtype": "float32"}, {"name": "732", "dtype": "float32"}, {"name": "733", "dtype": "float32"}, {"name": "734", "dtype": "float32"}, {"name": "735", "dtype": "float32"}, {"name": "736", "dtype": "float32"}, {"name": "737", "dtype": "float32"}, {"name": "738", "dtype": "float32"}, {"name": "739", "dtype": "float32"}, {"name": "740", "dtype": "float32"}, {"name": "741", "dtype": "float32"}, {"name": "742", "dtype": "float32"}, {"name": "743", "dtype": "float32"}, {"name": "744", "dtype": "float32"}, {"name": "745", "dtype": "float32"}, {"name": "746", "dtype": "float32"}, {"name": "747", "dtype": "float32"}, {"name": "748", "dtype": "float32"}, {"name": "749", "dtype": "float32"}, {"name": "750", "dtype": "float32"}, {"name": "751", "dtype": "float32"}, {"name": "752", "dtype": "float32"}, {"name": "753", "dtype": "float32"}, {"name": "754", "dtype": "float32"}, {"name": "755", "dtype": "float32"}, {"name": "756", "dtype": "float32"}, {"name": "757", "dtype": "float32"}, {"name": "758", "dtype": "float32"}, {"name": "759", "dtype": "float32"}, {"name": "760", "dtype": "float32"}, {"name": "761", "dtype": "float32"}, {"name": "762", "dtype": "float32"}, {"name": "763", "dtype": "float32"}, {"name": "764", "dtype": "float32"}, {"name": "765", "dtype": "float32"}, {"name": 
"766", "dtype": "float32"}, {"name": "767", "dtype": "float32"}, {"name": "label", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 115608907.5, "num_examples": 37500}, {"name": "test", "num_bytes": 38536305.0, "num_examples": 12500}], "download_size": 211879550, "dataset_size": 154145212.5}}
|
2023-08-23T04:05:50+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "PKDD_RoBERTa_Finetuned"
More Information needed
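Until the card is filled in, here is a minimal sketch of loading the 768-dimensional embeddings described in the metadata above into a feature matrix; the stacking step is illustrative, not part of the original card:

```python
import numpy as np
from datasets import load_dataset

ds = load_dataset("EgilKarlsen/PKDD_RoBERTa_Finetuned")

feature_cols = [str(i) for i in range(768)]  # columns "0" ... "767" per the metadata
train = ds["train"].with_format("numpy", columns=feature_cols)

# stack the per-column arrays into a (num_examples, 768) matrix
X = np.stack([train[c] for c in feature_cols], axis=1)
y = np.array(ds["train"]["label"])  # string labels
print(X.shape, y[:5])
```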
|
[
"# Dataset Card for \"PKDD_RoBERTa_Finetuned\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"PKDD_RoBERTa_Finetuned\"\n\nMore Information needed"
] |
[
6,
21
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"PKDD_RoBERTa_Finetuned\"\n\nMore Information needed"
] |
67a9ffd929eea5e060a20d053a9049020e5d680a
|
# Dataset of futatsuiwa_mamizou/二ッ岩マミゾウ (Touhou)
This is the dataset of futatsuiwa_mamizou/二ッ岩マミゾウ (Touhou), containing 500 images and their tags.
The core tags of this character are `brown_hair, animal_ears, glasses, raccoon_ears, leaf_on_head, short_hair, raccoon_tail, tail, brown_eyes, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 511.03 MiB | [Download](https://huggingface.co/datasets/CyberHarem/futatsuiwa_mamizou_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 334.27 MiB | [Download](https://huggingface.co/datasets/CyberHarem/futatsuiwa_mamizou_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1084 | 643.93 MiB | [Download](https://huggingface.co/datasets/CyberHarem/futatsuiwa_mamizou_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 465.72 MiB | [Download](https://huggingface.co/datasets/CyberHarem/futatsuiwa_mamizou_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1084 | 831.74 MiB | [Download](https://huggingface.co/datasets/CyberHarem/futatsuiwa_mamizou_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# download the raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/futatsuiwa_mamizou_touhou',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# extract the files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
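Building on the loop above, a hedged sketch of filtering the loaded items by tag and saving the matches; it assumes `item.image` behaves like a PIL image and that a simple membership test works on `item.meta['tags']`, neither of which the card states explicitly:

```python
import os

from waifuc.source import LocalSource

source = LocalSource('dataset_dir')  # the directory extracted above
out_dir = 'mamizou_kiseru'
os.makedirs(out_dir, exist_ok=True)

for i, item in enumerate(source):
    tags = item.meta['tags']  # assumed to support `in` for tag names
    if 'kiseru' in tags:
        # item.image is assumed to be a PIL image, as the print() above suggests
        item.image.save(os.path.join(out_dir, f'{i}.png'))
```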
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 13 |  |  |  |  |  | 1girl, leaf, solo, pince-nez, skirt, smile, bloomers, notepad, bottle, one_eye_closed, sandals, chibi, open_mouth |
| 1 | 8 |  |  |  |  |  | 1girl, leaf, sandals, skirt, solo, smile, pince-nez, sitting |
| 2 | 13 |  |  |  |  |  | 1girl, leaf, smile, solo, bell, hat, kiseru, gourd, skirt, notepad, pince-nez, clog_sandals, sitting |
| 3 | 16 |  |  |  |  |  | 1girl, brown_shirt, leaf, solo, closed_mouth, brown_skirt, raccoon_girl, short_sleeves, simple_background, smile, bangs, holding_smoking_pipe, kiseru, looking_at_viewer, full_body, white_background, :3, bell, hat, round_eyewear, sandals, sitting |
| 4 | 27 |  |  |  |  |  | leaf, 1girl, solo, bangs, green_kimono, looking_at_viewer, long_sleeves, smile, checkered_scarf, raccoon_girl, wide_sleeves, closed_mouth, :3, kiseru, haori, holding_smoking_pipe, one-hour_drawing_challenge, round_eyewear, smoke |
| 5 | 5 |  |  |  |  |  | 1girl, blush, large_breasts, leaf, looking_at_viewer, nipples, nude, solo, pince-nez, smile, barefoot, simple_background, lying, pussy, white_background |
| 6 | 12 |  |  |  |  |  | 1boy, 1girl, blush, hetero, leaf, solo_focus, nipples, penis, large_breasts, sex, vaginal, bar_censor, female_pubic_hair, nude, open_mouth, smile, cum_in_pussy, navel |
| 7 | 7 |  |  |  |  |  | 1girl, leaf, office_lady, solo, pencil_skirt, smile, black_jacket, black_skirt, brown_pantyhose, large_breasts, long_sleeves, looking_at_viewer, looking_over_eyewear, skirt_suit, sunglasses, white_shirt, alternate_costume, bangs, black_footwear, black_pantyhose, collared_shirt, crossed_legs, holding, sitting |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | leaf | solo | pince-nez | skirt | smile | bloomers | notepad | bottle | one_eye_closed | sandals | chibi | open_mouth | sitting | bell | hat | kiseru | gourd | clog_sandals | brown_shirt | closed_mouth | brown_skirt | raccoon_girl | short_sleeves | simple_background | bangs | holding_smoking_pipe | looking_at_viewer | full_body | white_background | :3 | round_eyewear | green_kimono | long_sleeves | checkered_scarf | wide_sleeves | haori | one-hour_drawing_challenge | smoke | blush | large_breasts | nipples | nude | barefoot | lying | pussy | 1boy | hetero | solo_focus | penis | sex | vaginal | bar_censor | female_pubic_hair | cum_in_pussy | navel | office_lady | pencil_skirt | black_jacket | black_skirt | brown_pantyhose | looking_over_eyewear | skirt_suit | sunglasses | white_shirt | alternate_costume | black_footwear | black_pantyhose | collared_shirt | crossed_legs | holding |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:-------|:------------|:--------|:--------|:-----------|:----------|:---------|:-----------------|:----------|:--------|:-------------|:----------|:-------|:------|:---------|:--------|:---------------|:--------------|:---------------|:--------------|:---------------|:----------------|:--------------------|:--------|:-----------------------|:--------------------|:------------|:-------------------|:-----|:----------------|:---------------|:---------------|:------------------|:---------------|:--------|:-----------------------------|:--------|:--------|:----------------|:----------|:-------|:-----------|:--------|:--------|:-------|:---------|:-------------|:--------|:------|:----------|:-------------|:--------------------|:---------------|:--------|:--------------|:---------------|:---------------|:--------------|:------------------|:-----------------------|:-------------|:-------------|:--------------|:--------------------|:-----------------|:------------------|:-----------------|:---------------|:----------|
| 0 | 13 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 8 |  |  |  |  |  | X | X | X | X | X | X | | | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 13 |  |  |  |  |  | X | X | X | X | X | X | | X | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 16 |  |  |  |  |  | X | X | X | | | X | | | | | X | | | X | X | X | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 27 |  |  |  |  |  | X | X | X | | | X | | | | | | | | | | | X | | | | X | | X | | | X | X | X | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | X | X | X | | X | | | | | | | | | | | | | | | | | | | X | | | X | | X | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 12 |  |  |  |  |  | X | X | | | | X | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | |
| 7 | 7 |  |  |  |  |  | X | X | X | | | X | | | | | | | | X | | | | | | | | | | | | X | | X | | | | | | X | | | | | | | X | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
CyberHarem/futatsuiwa_mamizou_touhou
|
[
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] |
2023-08-18T01:09:19+00:00
|
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
|
2024-01-14T22:08:39+00:00
|
[] |
[] |
TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
|
Dataset of futatsuiwa\_mamizou/二ッ岩マミゾウ (Touhou)
===============================================
This is the dataset of futatsuiwa\_mamizou/二ッ岩マミゾウ (Touhou), containing 500 images and their tags.
The core tags of this character are 'brown\_hair, animal\_ears, glasses, raccoon\_ears, leaf\_on\_head, short\_hair, raccoon\_tail, tail, brown\_eyes, breasts', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the DeepGHS Team (huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for waifuc loading. If you need it, run the following code.
List of Clusters
----------------
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
### Table Version
|
[
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
44,
61,
5,
4
] |
[
"passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version"
] |
b83093d3a286ff07d2731261080d5c32b2f93316
|
# Dataset Card for "PKDD_DistilRoBERTa_Finetuned"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
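As with the sibling PKDD cards, the features are 768 float32 columns plus a string label; a quick inspection sketch follows, assuming the split layout mirrors the metadata below (which is truncated here):

```python
from collections import Counter

from datasets import load_dataset

ds = load_dataset("EgilKarlsen/PKDD_DistilRoBERTa_Finetuned")
print(ds)  # expected: train/test splits with 768 feature columns plus "label"

# peek at the label distribution on the test split
print(Counter(ds["test"]["label"]).most_common(5))
```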
|
EgilKarlsen/PKDD_DistilRoBERTa_Finetuned
|
[
"region:us"
] |
2023-08-18T01:14:06+00:00
|
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "0", "dtype": "float32"}, {"name": "1", "dtype": "float32"}, {"name": "2", "dtype": "float32"}, {"name": "3", "dtype": "float32"}, {"name": "4", "dtype": "float32"}, {"name": "5", "dtype": "float32"}, {"name": "6", "dtype": "float32"}, {"name": "7", "dtype": "float32"}, {"name": "8", "dtype": "float32"}, {"name": "9", "dtype": "float32"}, {"name": "10", "dtype": "float32"}, {"name": "11", "dtype": "float32"}, {"name": "12", "dtype": "float32"}, {"name": "13", "dtype": "float32"}, {"name": "14", "dtype": "float32"}, {"name": "15", "dtype": "float32"}, {"name": "16", "dtype": "float32"}, {"name": "17", "dtype": "float32"}, {"name": "18", "dtype": "float32"}, {"name": "19", "dtype": "float32"}, {"name": "20", "dtype": "float32"}, {"name": "21", "dtype": "float32"}, {"name": "22", "dtype": "float32"}, {"name": "23", "dtype": "float32"}, {"name": "24", "dtype": "float32"}, {"name": "25", "dtype": "float32"}, {"name": "26", "dtype": "float32"}, {"name": "27", "dtype": "float32"}, {"name": "28", "dtype": "float32"}, {"name": "29", "dtype": "float32"}, {"name": "30", "dtype": "float32"}, {"name": "31", "dtype": "float32"}, {"name": "32", "dtype": "float32"}, {"name": "33", "dtype": "float32"}, {"name": "34", "dtype": "float32"}, {"name": "35", "dtype": "float32"}, {"name": "36", "dtype": "float32"}, {"name": "37", "dtype": "float32"}, {"name": "38", "dtype": "float32"}, {"name": "39", "dtype": "float32"}, {"name": "40", "dtype": "float32"}, {"name": "41", "dtype": "float32"}, {"name": "42", "dtype": "float32"}, {"name": "43", "dtype": "float32"}, {"name": "44", "dtype": "float32"}, {"name": "45", "dtype": "float32"}, {"name": "46", "dtype": "float32"}, {"name": "47", "dtype": "float32"}, {"name": "48", "dtype": "float32"}, {"name": "49", "dtype": "float32"}, {"name": "50", "dtype": "float32"}, {"name": "51", "dtype": "float32"}, {"name": "52", "dtype": "float32"}, {"name": "53", "dtype": "float32"}, {"name": "54", "dtype": "float32"}, {"name": "55", "dtype": "float32"}, {"name": "56", "dtype": "float32"}, {"name": "57", "dtype": "float32"}, {"name": "58", "dtype": "float32"}, {"name": "59", "dtype": "float32"}, {"name": "60", "dtype": "float32"}, {"name": "61", "dtype": "float32"}, {"name": "62", "dtype": "float32"}, {"name": "63", "dtype": "float32"}, {"name": "64", "dtype": "float32"}, {"name": "65", "dtype": "float32"}, {"name": "66", "dtype": "float32"}, {"name": "67", "dtype": "float32"}, {"name": "68", "dtype": "float32"}, {"name": "69", "dtype": "float32"}, {"name": "70", "dtype": "float32"}, {"name": "71", "dtype": "float32"}, {"name": "72", "dtype": "float32"}, {"name": "73", "dtype": "float32"}, {"name": "74", "dtype": "float32"}, {"name": "75", "dtype": "float32"}, {"name": "76", "dtype": "float32"}, {"name": "77", "dtype": "float32"}, {"name": "78", "dtype": "float32"}, {"name": "79", "dtype": "float32"}, {"name": "80", "dtype": "float32"}, {"name": "81", "dtype": "float32"}, {"name": "82", "dtype": "float32"}, {"name": "83", "dtype": "float32"}, {"name": "84", "dtype": "float32"}, {"name": "85", "dtype": "float32"}, {"name": "86", "dtype": "float32"}, {"name": "87", "dtype": "float32"}, {"name": "88", "dtype": "float32"}, {"name": "89", "dtype": "float32"}, {"name": "90", "dtype": "float32"}, {"name": "91", "dtype": "float32"}, {"name": "92", "dtype": "float32"}, {"name": "93", "dtype": "float32"}, 
{"name": "94", "dtype": "float32"}, {"name": "95", "dtype": "float32"}, {"name": "96", "dtype": "float32"}, {"name": "97", "dtype": "float32"}, {"name": "98", "dtype": "float32"}, {"name": "99", "dtype": "float32"}, {"name": "100", "dtype": "float32"}, {"name": "101", "dtype": "float32"}, {"name": "102", "dtype": "float32"}, {"name": "103", "dtype": "float32"}, {"name": "104", "dtype": "float32"}, {"name": "105", "dtype": "float32"}, {"name": "106", "dtype": "float32"}, {"name": "107", "dtype": "float32"}, {"name": "108", "dtype": "float32"}, {"name": "109", "dtype": "float32"}, {"name": "110", "dtype": "float32"}, {"name": "111", "dtype": "float32"}, {"name": "112", "dtype": "float32"}, {"name": "113", "dtype": "float32"}, {"name": "114", "dtype": "float32"}, {"name": "115", "dtype": "float32"}, {"name": "116", "dtype": "float32"}, {"name": "117", "dtype": "float32"}, {"name": "118", "dtype": "float32"}, {"name": "119", "dtype": "float32"}, {"name": "120", "dtype": "float32"}, {"name": "121", "dtype": "float32"}, {"name": "122", "dtype": "float32"}, {"name": "123", "dtype": "float32"}, {"name": "124", "dtype": "float32"}, {"name": "125", "dtype": "float32"}, {"name": "126", "dtype": "float32"}, {"name": "127", "dtype": "float32"}, {"name": "128", "dtype": "float32"}, {"name": "129", "dtype": "float32"}, {"name": "130", "dtype": "float32"}, {"name": "131", "dtype": "float32"}, {"name": "132", "dtype": "float32"}, {"name": "133", "dtype": "float32"}, {"name": "134", "dtype": "float32"}, {"name": "135", "dtype": "float32"}, {"name": "136", "dtype": "float32"}, {"name": "137", "dtype": "float32"}, {"name": "138", "dtype": "float32"}, {"name": "139", "dtype": "float32"}, {"name": "140", "dtype": "float32"}, {"name": "141", "dtype": "float32"}, {"name": "142", "dtype": "float32"}, {"name": "143", "dtype": "float32"}, {"name": "144", "dtype": "float32"}, {"name": "145", "dtype": "float32"}, {"name": "146", "dtype": "float32"}, {"name": "147", "dtype": "float32"}, {"name": "148", "dtype": "float32"}, {"name": "149", "dtype": "float32"}, {"name": "150", "dtype": "float32"}, {"name": "151", "dtype": "float32"}, {"name": "152", "dtype": "float32"}, {"name": "153", "dtype": "float32"}, {"name": "154", "dtype": "float32"}, {"name": "155", "dtype": "float32"}, {"name": "156", "dtype": "float32"}, {"name": "157", "dtype": "float32"}, {"name": "158", "dtype": "float32"}, {"name": "159", "dtype": "float32"}, {"name": "160", "dtype": "float32"}, {"name": "161", "dtype": "float32"}, {"name": "162", "dtype": "float32"}, {"name": "163", "dtype": "float32"}, {"name": "164", "dtype": "float32"}, {"name": "165", "dtype": "float32"}, {"name": "166", "dtype": "float32"}, {"name": "167", "dtype": "float32"}, {"name": "168", "dtype": "float32"}, {"name": "169", "dtype": "float32"}, {"name": "170", "dtype": "float32"}, {"name": "171", "dtype": "float32"}, {"name": "172", "dtype": "float32"}, {"name": "173", "dtype": "float32"}, {"name": "174", "dtype": "float32"}, {"name": "175", "dtype": "float32"}, {"name": "176", "dtype": "float32"}, {"name": "177", "dtype": "float32"}, {"name": "178", "dtype": "float32"}, {"name": "179", "dtype": "float32"}, {"name": "180", "dtype": "float32"}, {"name": "181", "dtype": "float32"}, {"name": "182", "dtype": "float32"}, {"name": "183", "dtype": "float32"}, {"name": "184", "dtype": "float32"}, {"name": "185", "dtype": "float32"}, {"name": "186", "dtype": "float32"}, {"name": "187", "dtype": "float32"}, {"name": "188", "dtype": "float32"}, {"name": "189", "dtype": "float32"}, {"name": 
"190", "dtype": "float32"}, {"name": "191", "dtype": "float32"}, {"name": "192", "dtype": "float32"}, {"name": "193", "dtype": "float32"}, {"name": "194", "dtype": "float32"}, {"name": "195", "dtype": "float32"}, {"name": "196", "dtype": "float32"}, {"name": "197", "dtype": "float32"}, {"name": "198", "dtype": "float32"}, {"name": "199", "dtype": "float32"}, {"name": "200", "dtype": "float32"}, {"name": "201", "dtype": "float32"}, {"name": "202", "dtype": "float32"}, {"name": "203", "dtype": "float32"}, {"name": "204", "dtype": "float32"}, {"name": "205", "dtype": "float32"}, {"name": "206", "dtype": "float32"}, {"name": "207", "dtype": "float32"}, {"name": "208", "dtype": "float32"}, {"name": "209", "dtype": "float32"}, {"name": "210", "dtype": "float32"}, {"name": "211", "dtype": "float32"}, {"name": "212", "dtype": "float32"}, {"name": "213", "dtype": "float32"}, {"name": "214", "dtype": "float32"}, {"name": "215", "dtype": "float32"}, {"name": "216", "dtype": "float32"}, {"name": "217", "dtype": "float32"}, {"name": "218", "dtype": "float32"}, {"name": "219", "dtype": "float32"}, {"name": "220", "dtype": "float32"}, {"name": "221", "dtype": "float32"}, {"name": "222", "dtype": "float32"}, {"name": "223", "dtype": "float32"}, {"name": "224", "dtype": "float32"}, {"name": "225", "dtype": "float32"}, {"name": "226", "dtype": "float32"}, {"name": "227", "dtype": "float32"}, {"name": "228", "dtype": "float32"}, {"name": "229", "dtype": "float32"}, {"name": "230", "dtype": "float32"}, {"name": "231", "dtype": "float32"}, {"name": "232", "dtype": "float32"}, {"name": "233", "dtype": "float32"}, {"name": "234", "dtype": "float32"}, {"name": "235", "dtype": "float32"}, {"name": "236", "dtype": "float32"}, {"name": "237", "dtype": "float32"}, {"name": "238", "dtype": "float32"}, {"name": "239", "dtype": "float32"}, {"name": "240", "dtype": "float32"}, {"name": "241", "dtype": "float32"}, {"name": "242", "dtype": "float32"}, {"name": "243", "dtype": "float32"}, {"name": "244", "dtype": "float32"}, {"name": "245", "dtype": "float32"}, {"name": "246", "dtype": "float32"}, {"name": "247", "dtype": "float32"}, {"name": "248", "dtype": "float32"}, {"name": "249", "dtype": "float32"}, {"name": "250", "dtype": "float32"}, {"name": "251", "dtype": "float32"}, {"name": "252", "dtype": "float32"}, {"name": "253", "dtype": "float32"}, {"name": "254", "dtype": "float32"}, {"name": "255", "dtype": "float32"}, {"name": "256", "dtype": "float32"}, {"name": "257", "dtype": "float32"}, {"name": "258", "dtype": "float32"}, {"name": "259", "dtype": "float32"}, {"name": "260", "dtype": "float32"}, {"name": "261", "dtype": "float32"}, {"name": "262", "dtype": "float32"}, {"name": "263", "dtype": "float32"}, {"name": "264", "dtype": "float32"}, {"name": "265", "dtype": "float32"}, {"name": "266", "dtype": "float32"}, {"name": "267", "dtype": "float32"}, {"name": "268", "dtype": "float32"}, {"name": "269", "dtype": "float32"}, {"name": "270", "dtype": "float32"}, {"name": "271", "dtype": "float32"}, {"name": "272", "dtype": "float32"}, {"name": "273", "dtype": "float32"}, {"name": "274", "dtype": "float32"}, {"name": "275", "dtype": "float32"}, {"name": "276", "dtype": "float32"}, {"name": "277", "dtype": "float32"}, {"name": "278", "dtype": "float32"}, {"name": "279", "dtype": "float32"}, {"name": "280", "dtype": "float32"}, {"name": "281", "dtype": "float32"}, {"name": "282", "dtype": "float32"}, {"name": "283", "dtype": "float32"}, {"name": "284", "dtype": "float32"}, {"name": "285", "dtype": "float32"}, {"name": 
"286", "dtype": "float32"}, {"name": "287", "dtype": "float32"}, {"name": "288", "dtype": "float32"}, {"name": "289", "dtype": "float32"}, {"name": "290", "dtype": "float32"}, {"name": "291", "dtype": "float32"}, {"name": "292", "dtype": "float32"}, {"name": "293", "dtype": "float32"}, {"name": "294", "dtype": "float32"}, {"name": "295", "dtype": "float32"}, {"name": "296", "dtype": "float32"}, {"name": "297", "dtype": "float32"}, {"name": "298", "dtype": "float32"}, {"name": "299", "dtype": "float32"}, {"name": "300", "dtype": "float32"}, {"name": "301", "dtype": "float32"}, {"name": "302", "dtype": "float32"}, {"name": "303", "dtype": "float32"}, {"name": "304", "dtype": "float32"}, {"name": "305", "dtype": "float32"}, {"name": "306", "dtype": "float32"}, {"name": "307", "dtype": "float32"}, {"name": "308", "dtype": "float32"}, {"name": "309", "dtype": "float32"}, {"name": "310", "dtype": "float32"}, {"name": "311", "dtype": "float32"}, {"name": "312", "dtype": "float32"}, {"name": "313", "dtype": "float32"}, {"name": "314", "dtype": "float32"}, {"name": "315", "dtype": "float32"}, {"name": "316", "dtype": "float32"}, {"name": "317", "dtype": "float32"}, {"name": "318", "dtype": "float32"}, {"name": "319", "dtype": "float32"}, {"name": "320", "dtype": "float32"}, {"name": "321", "dtype": "float32"}, {"name": "322", "dtype": "float32"}, {"name": "323", "dtype": "float32"}, {"name": "324", "dtype": "float32"}, {"name": "325", "dtype": "float32"}, {"name": "326", "dtype": "float32"}, {"name": "327", "dtype": "float32"}, {"name": "328", "dtype": "float32"}, {"name": "329", "dtype": "float32"}, {"name": "330", "dtype": "float32"}, {"name": "331", "dtype": "float32"}, {"name": "332", "dtype": "float32"}, {"name": "333", "dtype": "float32"}, {"name": "334", "dtype": "float32"}, {"name": "335", "dtype": "float32"}, {"name": "336", "dtype": "float32"}, {"name": "337", "dtype": "float32"}, {"name": "338", "dtype": "float32"}, {"name": "339", "dtype": "float32"}, {"name": "340", "dtype": "float32"}, {"name": "341", "dtype": "float32"}, {"name": "342", "dtype": "float32"}, {"name": "343", "dtype": "float32"}, {"name": "344", "dtype": "float32"}, {"name": "345", "dtype": "float32"}, {"name": "346", "dtype": "float32"}, {"name": "347", "dtype": "float32"}, {"name": "348", "dtype": "float32"}, {"name": "349", "dtype": "float32"}, {"name": "350", "dtype": "float32"}, {"name": "351", "dtype": "float32"}, {"name": "352", "dtype": "float32"}, {"name": "353", "dtype": "float32"}, {"name": "354", "dtype": "float32"}, {"name": "355", "dtype": "float32"}, {"name": "356", "dtype": "float32"}, {"name": "357", "dtype": "float32"}, {"name": "358", "dtype": "float32"}, {"name": "359", "dtype": "float32"}, {"name": "360", "dtype": "float32"}, {"name": "361", "dtype": "float32"}, {"name": "362", "dtype": "float32"}, {"name": "363", "dtype": "float32"}, {"name": "364", "dtype": "float32"}, {"name": "365", "dtype": "float32"}, {"name": "366", "dtype": "float32"}, {"name": "367", "dtype": "float32"}, {"name": "368", "dtype": "float32"}, {"name": "369", "dtype": "float32"}, {"name": "370", "dtype": "float32"}, {"name": "371", "dtype": "float32"}, {"name": "372", "dtype": "float32"}, {"name": "373", "dtype": "float32"}, {"name": "374", "dtype": "float32"}, {"name": "375", "dtype": "float32"}, {"name": "376", "dtype": "float32"}, {"name": "377", "dtype": "float32"}, {"name": "378", "dtype": "float32"}, {"name": "379", "dtype": "float32"}, {"name": "380", "dtype": "float32"}, {"name": "381", "dtype": "float32"}, {"name": 
"382", "dtype": "float32"}, {"name": "383", "dtype": "float32"}, {"name": "384", "dtype": "float32"}, {"name": "385", "dtype": "float32"}, {"name": "386", "dtype": "float32"}, {"name": "387", "dtype": "float32"}, {"name": "388", "dtype": "float32"}, {"name": "389", "dtype": "float32"}, {"name": "390", "dtype": "float32"}, {"name": "391", "dtype": "float32"}, {"name": "392", "dtype": "float32"}, {"name": "393", "dtype": "float32"}, {"name": "394", "dtype": "float32"}, {"name": "395", "dtype": "float32"}, {"name": "396", "dtype": "float32"}, {"name": "397", "dtype": "float32"}, {"name": "398", "dtype": "float32"}, {"name": "399", "dtype": "float32"}, {"name": "400", "dtype": "float32"}, {"name": "401", "dtype": "float32"}, {"name": "402", "dtype": "float32"}, {"name": "403", "dtype": "float32"}, {"name": "404", "dtype": "float32"}, {"name": "405", "dtype": "float32"}, {"name": "406", "dtype": "float32"}, {"name": "407", "dtype": "float32"}, {"name": "408", "dtype": "float32"}, {"name": "409", "dtype": "float32"}, {"name": "410", "dtype": "float32"}, {"name": "411", "dtype": "float32"}, {"name": "412", "dtype": "float32"}, {"name": "413", "dtype": "float32"}, {"name": "414", "dtype": "float32"}, {"name": "415", "dtype": "float32"}, {"name": "416", "dtype": "float32"}, {"name": "417", "dtype": "float32"}, {"name": "418", "dtype": "float32"}, {"name": "419", "dtype": "float32"}, {"name": "420", "dtype": "float32"}, {"name": "421", "dtype": "float32"}, {"name": "422", "dtype": "float32"}, {"name": "423", "dtype": "float32"}, {"name": "424", "dtype": "float32"}, {"name": "425", "dtype": "float32"}, {"name": "426", "dtype": "float32"}, {"name": "427", "dtype": "float32"}, {"name": "428", "dtype": "float32"}, {"name": "429", "dtype": "float32"}, {"name": "430", "dtype": "float32"}, {"name": "431", "dtype": "float32"}, {"name": "432", "dtype": "float32"}, {"name": "433", "dtype": "float32"}, {"name": "434", "dtype": "float32"}, {"name": "435", "dtype": "float32"}, {"name": "436", "dtype": "float32"}, {"name": "437", "dtype": "float32"}, {"name": "438", "dtype": "float32"}, {"name": "439", "dtype": "float32"}, {"name": "440", "dtype": "float32"}, {"name": "441", "dtype": "float32"}, {"name": "442", "dtype": "float32"}, {"name": "443", "dtype": "float32"}, {"name": "444", "dtype": "float32"}, {"name": "445", "dtype": "float32"}, {"name": "446", "dtype": "float32"}, {"name": "447", "dtype": "float32"}, {"name": "448", "dtype": "float32"}, {"name": "449", "dtype": "float32"}, {"name": "450", "dtype": "float32"}, {"name": "451", "dtype": "float32"}, {"name": "452", "dtype": "float32"}, {"name": "453", "dtype": "float32"}, {"name": "454", "dtype": "float32"}, {"name": "455", "dtype": "float32"}, {"name": "456", "dtype": "float32"}, {"name": "457", "dtype": "float32"}, {"name": "458", "dtype": "float32"}, {"name": "459", "dtype": "float32"}, {"name": "460", "dtype": "float32"}, {"name": "461", "dtype": "float32"}, {"name": "462", "dtype": "float32"}, {"name": "463", "dtype": "float32"}, {"name": "464", "dtype": "float32"}, {"name": "465", "dtype": "float32"}, {"name": "466", "dtype": "float32"}, {"name": "467", "dtype": "float32"}, {"name": "468", "dtype": "float32"}, {"name": "469", "dtype": "float32"}, {"name": "470", "dtype": "float32"}, {"name": "471", "dtype": "float32"}, {"name": "472", "dtype": "float32"}, {"name": "473", "dtype": "float32"}, {"name": "474", "dtype": "float32"}, {"name": "475", "dtype": "float32"}, {"name": "476", "dtype": "float32"}, {"name": "477", "dtype": "float32"}, {"name": 
"478", "dtype": "float32"}, {"name": "479", "dtype": "float32"}, {"name": "480", "dtype": "float32"}, {"name": "481", "dtype": "float32"}, {"name": "482", "dtype": "float32"}, {"name": "483", "dtype": "float32"}, {"name": "484", "dtype": "float32"}, {"name": "485", "dtype": "float32"}, {"name": "486", "dtype": "float32"}, {"name": "487", "dtype": "float32"}, {"name": "488", "dtype": "float32"}, {"name": "489", "dtype": "float32"}, {"name": "490", "dtype": "float32"}, {"name": "491", "dtype": "float32"}, {"name": "492", "dtype": "float32"}, {"name": "493", "dtype": "float32"}, {"name": "494", "dtype": "float32"}, {"name": "495", "dtype": "float32"}, {"name": "496", "dtype": "float32"}, {"name": "497", "dtype": "float32"}, {"name": "498", "dtype": "float32"}, {"name": "499", "dtype": "float32"}, {"name": "500", "dtype": "float32"}, {"name": "501", "dtype": "float32"}, {"name": "502", "dtype": "float32"}, {"name": "503", "dtype": "float32"}, {"name": "504", "dtype": "float32"}, {"name": "505", "dtype": "float32"}, {"name": "506", "dtype": "float32"}, {"name": "507", "dtype": "float32"}, {"name": "508", "dtype": "float32"}, {"name": "509", "dtype": "float32"}, {"name": "510", "dtype": "float32"}, {"name": "511", "dtype": "float32"}, {"name": "512", "dtype": "float32"}, {"name": "513", "dtype": "float32"}, {"name": "514", "dtype": "float32"}, {"name": "515", "dtype": "float32"}, {"name": "516", "dtype": "float32"}, {"name": "517", "dtype": "float32"}, {"name": "518", "dtype": "float32"}, {"name": "519", "dtype": "float32"}, {"name": "520", "dtype": "float32"}, {"name": "521", "dtype": "float32"}, {"name": "522", "dtype": "float32"}, {"name": "523", "dtype": "float32"}, {"name": "524", "dtype": "float32"}, {"name": "525", "dtype": "float32"}, {"name": "526", "dtype": "float32"}, {"name": "527", "dtype": "float32"}, {"name": "528", "dtype": "float32"}, {"name": "529", "dtype": "float32"}, {"name": "530", "dtype": "float32"}, {"name": "531", "dtype": "float32"}, {"name": "532", "dtype": "float32"}, {"name": "533", "dtype": "float32"}, {"name": "534", "dtype": "float32"}, {"name": "535", "dtype": "float32"}, {"name": "536", "dtype": "float32"}, {"name": "537", "dtype": "float32"}, {"name": "538", "dtype": "float32"}, {"name": "539", "dtype": "float32"}, {"name": "540", "dtype": "float32"}, {"name": "541", "dtype": "float32"}, {"name": "542", "dtype": "float32"}, {"name": "543", "dtype": "float32"}, {"name": "544", "dtype": "float32"}, {"name": "545", "dtype": "float32"}, {"name": "546", "dtype": "float32"}, {"name": "547", "dtype": "float32"}, {"name": "548", "dtype": "float32"}, {"name": "549", "dtype": "float32"}, {"name": "550", "dtype": "float32"}, {"name": "551", "dtype": "float32"}, {"name": "552", "dtype": "float32"}, {"name": "553", "dtype": "float32"}, {"name": "554", "dtype": "float32"}, {"name": "555", "dtype": "float32"}, {"name": "556", "dtype": "float32"}, {"name": "557", "dtype": "float32"}, {"name": "558", "dtype": "float32"}, {"name": "559", "dtype": "float32"}, {"name": "560", "dtype": "float32"}, {"name": "561", "dtype": "float32"}, {"name": "562", "dtype": "float32"}, {"name": "563", "dtype": "float32"}, {"name": "564", "dtype": "float32"}, {"name": "565", "dtype": "float32"}, {"name": "566", "dtype": "float32"}, {"name": "567", "dtype": "float32"}, {"name": "568", "dtype": "float32"}, {"name": "569", "dtype": "float32"}, {"name": "570", "dtype": "float32"}, {"name": "571", "dtype": "float32"}, {"name": "572", "dtype": "float32"}, {"name": "573", "dtype": "float32"}, {"name": 
"574", "dtype": "float32"}, {"name": "575", "dtype": "float32"}, {"name": "576", "dtype": "float32"}, {"name": "577", "dtype": "float32"}, {"name": "578", "dtype": "float32"}, {"name": "579", "dtype": "float32"}, {"name": "580", "dtype": "float32"}, {"name": "581", "dtype": "float32"}, {"name": "582", "dtype": "float32"}, {"name": "583", "dtype": "float32"}, {"name": "584", "dtype": "float32"}, {"name": "585", "dtype": "float32"}, {"name": "586", "dtype": "float32"}, {"name": "587", "dtype": "float32"}, {"name": "588", "dtype": "float32"}, {"name": "589", "dtype": "float32"}, {"name": "590", "dtype": "float32"}, {"name": "591", "dtype": "float32"}, {"name": "592", "dtype": "float32"}, {"name": "593", "dtype": "float32"}, {"name": "594", "dtype": "float32"}, {"name": "595", "dtype": "float32"}, {"name": "596", "dtype": "float32"}, {"name": "597", "dtype": "float32"}, {"name": "598", "dtype": "float32"}, {"name": "599", "dtype": "float32"}, {"name": "600", "dtype": "float32"}, {"name": "601", "dtype": "float32"}, {"name": "602", "dtype": "float32"}, {"name": "603", "dtype": "float32"}, {"name": "604", "dtype": "float32"}, {"name": "605", "dtype": "float32"}, {"name": "606", "dtype": "float32"}, {"name": "607", "dtype": "float32"}, {"name": "608", "dtype": "float32"}, {"name": "609", "dtype": "float32"}, {"name": "610", "dtype": "float32"}, {"name": "611", "dtype": "float32"}, {"name": "612", "dtype": "float32"}, {"name": "613", "dtype": "float32"}, {"name": "614", "dtype": "float32"}, {"name": "615", "dtype": "float32"}, {"name": "616", "dtype": "float32"}, {"name": "617", "dtype": "float32"}, {"name": "618", "dtype": "float32"}, {"name": "619", "dtype": "float32"}, {"name": "620", "dtype": "float32"}, {"name": "621", "dtype": "float32"}, {"name": "622", "dtype": "float32"}, {"name": "623", "dtype": "float32"}, {"name": "624", "dtype": "float32"}, {"name": "625", "dtype": "float32"}, {"name": "626", "dtype": "float32"}, {"name": "627", "dtype": "float32"}, {"name": "628", "dtype": "float32"}, {"name": "629", "dtype": "float32"}, {"name": "630", "dtype": "float32"}, {"name": "631", "dtype": "float32"}, {"name": "632", "dtype": "float32"}, {"name": "633", "dtype": "float32"}, {"name": "634", "dtype": "float32"}, {"name": "635", "dtype": "float32"}, {"name": "636", "dtype": "float32"}, {"name": "637", "dtype": "float32"}, {"name": "638", "dtype": "float32"}, {"name": "639", "dtype": "float32"}, {"name": "640", "dtype": "float32"}, {"name": "641", "dtype": "float32"}, {"name": "642", "dtype": "float32"}, {"name": "643", "dtype": "float32"}, {"name": "644", "dtype": "float32"}, {"name": "645", "dtype": "float32"}, {"name": "646", "dtype": "float32"}, {"name": "647", "dtype": "float32"}, {"name": "648", "dtype": "float32"}, {"name": "649", "dtype": "float32"}, {"name": "650", "dtype": "float32"}, {"name": "651", "dtype": "float32"}, {"name": "652", "dtype": "float32"}, {"name": "653", "dtype": "float32"}, {"name": "654", "dtype": "float32"}, {"name": "655", "dtype": "float32"}, {"name": "656", "dtype": "float32"}, {"name": "657", "dtype": "float32"}, {"name": "658", "dtype": "float32"}, {"name": "659", "dtype": "float32"}, {"name": "660", "dtype": "float32"}, {"name": "661", "dtype": "float32"}, {"name": "662", "dtype": "float32"}, {"name": "663", "dtype": "float32"}, {"name": "664", "dtype": "float32"}, {"name": "665", "dtype": "float32"}, {"name": "666", "dtype": "float32"}, {"name": "667", "dtype": "float32"}, {"name": "668", "dtype": "float32"}, {"name": "669", "dtype": "float32"}, {"name": 
"670", "dtype": "float32"}, {"name": "671", "dtype": "float32"}, {"name": "672", "dtype": "float32"}, {"name": "673", "dtype": "float32"}, {"name": "674", "dtype": "float32"}, {"name": "675", "dtype": "float32"}, {"name": "676", "dtype": "float32"}, {"name": "677", "dtype": "float32"}, {"name": "678", "dtype": "float32"}, {"name": "679", "dtype": "float32"}, {"name": "680", "dtype": "float32"}, {"name": "681", "dtype": "float32"}, {"name": "682", "dtype": "float32"}, {"name": "683", "dtype": "float32"}, {"name": "684", "dtype": "float32"}, {"name": "685", "dtype": "float32"}, {"name": "686", "dtype": "float32"}, {"name": "687", "dtype": "float32"}, {"name": "688", "dtype": "float32"}, {"name": "689", "dtype": "float32"}, {"name": "690", "dtype": "float32"}, {"name": "691", "dtype": "float32"}, {"name": "692", "dtype": "float32"}, {"name": "693", "dtype": "float32"}, {"name": "694", "dtype": "float32"}, {"name": "695", "dtype": "float32"}, {"name": "696", "dtype": "float32"}, {"name": "697", "dtype": "float32"}, {"name": "698", "dtype": "float32"}, {"name": "699", "dtype": "float32"}, {"name": "700", "dtype": "float32"}, {"name": "701", "dtype": "float32"}, {"name": "702", "dtype": "float32"}, {"name": "703", "dtype": "float32"}, {"name": "704", "dtype": "float32"}, {"name": "705", "dtype": "float32"}, {"name": "706", "dtype": "float32"}, {"name": "707", "dtype": "float32"}, {"name": "708", "dtype": "float32"}, {"name": "709", "dtype": "float32"}, {"name": "710", "dtype": "float32"}, {"name": "711", "dtype": "float32"}, {"name": "712", "dtype": "float32"}, {"name": "713", "dtype": "float32"}, {"name": "714", "dtype": "float32"}, {"name": "715", "dtype": "float32"}, {"name": "716", "dtype": "float32"}, {"name": "717", "dtype": "float32"}, {"name": "718", "dtype": "float32"}, {"name": "719", "dtype": "float32"}, {"name": "720", "dtype": "float32"}, {"name": "721", "dtype": "float32"}, {"name": "722", "dtype": "float32"}, {"name": "723", "dtype": "float32"}, {"name": "724", "dtype": "float32"}, {"name": "725", "dtype": "float32"}, {"name": "726", "dtype": "float32"}, {"name": "727", "dtype": "float32"}, {"name": "728", "dtype": "float32"}, {"name": "729", "dtype": "float32"}, {"name": "730", "dtype": "float32"}, {"name": "731", "dtype": "float32"}, {"name": "732", "dtype": "float32"}, {"name": "733", "dtype": "float32"}, {"name": "734", "dtype": "float32"}, {"name": "735", "dtype": "float32"}, {"name": "736", "dtype": "float32"}, {"name": "737", "dtype": "float32"}, {"name": "738", "dtype": "float32"}, {"name": "739", "dtype": "float32"}, {"name": "740", "dtype": "float32"}, {"name": "741", "dtype": "float32"}, {"name": "742", "dtype": "float32"}, {"name": "743", "dtype": "float32"}, {"name": "744", "dtype": "float32"}, {"name": "745", "dtype": "float32"}, {"name": "746", "dtype": "float32"}, {"name": "747", "dtype": "float32"}, {"name": "748", "dtype": "float32"}, {"name": "749", "dtype": "float32"}, {"name": "750", "dtype": "float32"}, {"name": "751", "dtype": "float32"}, {"name": "752", "dtype": "float32"}, {"name": "753", "dtype": "float32"}, {"name": "754", "dtype": "float32"}, {"name": "755", "dtype": "float32"}, {"name": "756", "dtype": "float32"}, {"name": "757", "dtype": "float32"}, {"name": "758", "dtype": "float32"}, {"name": "759", "dtype": "float32"}, {"name": "760", "dtype": "float32"}, {"name": "761", "dtype": "float32"}, {"name": "762", "dtype": "float32"}, {"name": "763", "dtype": "float32"}, {"name": "764", "dtype": "float32"}, {"name": "765", "dtype": "float32"}, {"name": 
"766", "dtype": "float32"}, {"name": "767", "dtype": "float32"}, {"name": "label", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 115608907.5, "num_examples": 37500}, {"name": "test", "num_bytes": 38536305.0, "num_examples": 12500}], "download_size": 211880357, "dataset_size": 154145212.5}}
|
2023-08-23T04:15:01+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "PKDD_DistilRoBERTa_Finetuned"
More Information needed
|
[
"# Dataset Card for \"PKDD_DistilRoBERTa_Finetuned\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"PKDD_DistilRoBERTa_Finetuned\"\n\nMore Information needed"
] |
[
6,
23
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"PKDD_DistilRoBERTa_Finetuned\"\n\nMore Information needed"
] |
1e742c8177e83ab1c582c7f99ab9143c27bed3e5
|
# Dataset Card for "PKDD_GPT2_Finetuned"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
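Given the feature list later in this record (768 float32 columns named "0" through "767" plus a string `label`), a minimal loading sketch might look like the following; treating the columns as one embedding vector per row is an assumption, not something the card states.
```python
from datasets import load_dataset
import numpy as np

# Load the test split (repo id taken from this record; split sizes per its metadata).
ds = load_dataset("EgilKarlsen/PKDD_GPT2_Finetuned", split="test")

# Assumption: columns "0".."767" form one 768-dim vector per example.
X = np.stack([np.asarray(ds[str(i)], dtype=np.float32) for i in range(768)], axis=1)
y = ds["label"]

print(X.shape, len(y))  # expect (12500, 768) and 12500, per the split metadata
```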
|
EgilKarlsen/PKDD_GPT2_Finetuned
|
[
"region:us"
] |
2023-08-18T01:30:40+00:00
|
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "0", "dtype": "float32"}, {"name": "1", "dtype": "float32"}, {"name": "2", "dtype": "float32"}, {"name": "3", "dtype": "float32"}, {"name": "4", "dtype": "float32"}, {"name": "5", "dtype": "float32"}, {"name": "6", "dtype": "float32"}, {"name": "7", "dtype": "float32"}, {"name": "8", "dtype": "float32"}, {"name": "9", "dtype": "float32"}, {"name": "10", "dtype": "float32"}, {"name": "11", "dtype": "float32"}, {"name": "12", "dtype": "float32"}, {"name": "13", "dtype": "float32"}, {"name": "14", "dtype": "float32"}, {"name": "15", "dtype": "float32"}, {"name": "16", "dtype": "float32"}, {"name": "17", "dtype": "float32"}, {"name": "18", "dtype": "float32"}, {"name": "19", "dtype": "float32"}, {"name": "20", "dtype": "float32"}, {"name": "21", "dtype": "float32"}, {"name": "22", "dtype": "float32"}, {"name": "23", "dtype": "float32"}, {"name": "24", "dtype": "float32"}, {"name": "25", "dtype": "float32"}, {"name": "26", "dtype": "float32"}, {"name": "27", "dtype": "float32"}, {"name": "28", "dtype": "float32"}, {"name": "29", "dtype": "float32"}, {"name": "30", "dtype": "float32"}, {"name": "31", "dtype": "float32"}, {"name": "32", "dtype": "float32"}, {"name": "33", "dtype": "float32"}, {"name": "34", "dtype": "float32"}, {"name": "35", "dtype": "float32"}, {"name": "36", "dtype": "float32"}, {"name": "37", "dtype": "float32"}, {"name": "38", "dtype": "float32"}, {"name": "39", "dtype": "float32"}, {"name": "40", "dtype": "float32"}, {"name": "41", "dtype": "float32"}, {"name": "42", "dtype": "float32"}, {"name": "43", "dtype": "float32"}, {"name": "44", "dtype": "float32"}, {"name": "45", "dtype": "float32"}, {"name": "46", "dtype": "float32"}, {"name": "47", "dtype": "float32"}, {"name": "48", "dtype": "float32"}, {"name": "49", "dtype": "float32"}, {"name": "50", "dtype": "float32"}, {"name": "51", "dtype": "float32"}, {"name": "52", "dtype": "float32"}, {"name": "53", "dtype": "float32"}, {"name": "54", "dtype": "float32"}, {"name": "55", "dtype": "float32"}, {"name": "56", "dtype": "float32"}, {"name": "57", "dtype": "float32"}, {"name": "58", "dtype": "float32"}, {"name": "59", "dtype": "float32"}, {"name": "60", "dtype": "float32"}, {"name": "61", "dtype": "float32"}, {"name": "62", "dtype": "float32"}, {"name": "63", "dtype": "float32"}, {"name": "64", "dtype": "float32"}, {"name": "65", "dtype": "float32"}, {"name": "66", "dtype": "float32"}, {"name": "67", "dtype": "float32"}, {"name": "68", "dtype": "float32"}, {"name": "69", "dtype": "float32"}, {"name": "70", "dtype": "float32"}, {"name": "71", "dtype": "float32"}, {"name": "72", "dtype": "float32"}, {"name": "73", "dtype": "float32"}, {"name": "74", "dtype": "float32"}, {"name": "75", "dtype": "float32"}, {"name": "76", "dtype": "float32"}, {"name": "77", "dtype": "float32"}, {"name": "78", "dtype": "float32"}, {"name": "79", "dtype": "float32"}, {"name": "80", "dtype": "float32"}, {"name": "81", "dtype": "float32"}, {"name": "82", "dtype": "float32"}, {"name": "83", "dtype": "float32"}, {"name": "84", "dtype": "float32"}, {"name": "85", "dtype": "float32"}, {"name": "86", "dtype": "float32"}, {"name": "87", "dtype": "float32"}, {"name": "88", "dtype": "float32"}, {"name": "89", "dtype": "float32"}, {"name": "90", "dtype": "float32"}, {"name": "91", "dtype": "float32"}, {"name": "92", "dtype": "float32"}, {"name": "93", "dtype": "float32"}, 
{"name": "94", "dtype": "float32"}, {"name": "95", "dtype": "float32"}, {"name": "96", "dtype": "float32"}, {"name": "97", "dtype": "float32"}, {"name": "98", "dtype": "float32"}, {"name": "99", "dtype": "float32"}, {"name": "100", "dtype": "float32"}, {"name": "101", "dtype": "float32"}, {"name": "102", "dtype": "float32"}, {"name": "103", "dtype": "float32"}, {"name": "104", "dtype": "float32"}, {"name": "105", "dtype": "float32"}, {"name": "106", "dtype": "float32"}, {"name": "107", "dtype": "float32"}, {"name": "108", "dtype": "float32"}, {"name": "109", "dtype": "float32"}, {"name": "110", "dtype": "float32"}, {"name": "111", "dtype": "float32"}, {"name": "112", "dtype": "float32"}, {"name": "113", "dtype": "float32"}, {"name": "114", "dtype": "float32"}, {"name": "115", "dtype": "float32"}, {"name": "116", "dtype": "float32"}, {"name": "117", "dtype": "float32"}, {"name": "118", "dtype": "float32"}, {"name": "119", "dtype": "float32"}, {"name": "120", "dtype": "float32"}, {"name": "121", "dtype": "float32"}, {"name": "122", "dtype": "float32"}, {"name": "123", "dtype": "float32"}, {"name": "124", "dtype": "float32"}, {"name": "125", "dtype": "float32"}, {"name": "126", "dtype": "float32"}, {"name": "127", "dtype": "float32"}, {"name": "128", "dtype": "float32"}, {"name": "129", "dtype": "float32"}, {"name": "130", "dtype": "float32"}, {"name": "131", "dtype": "float32"}, {"name": "132", "dtype": "float32"}, {"name": "133", "dtype": "float32"}, {"name": "134", "dtype": "float32"}, {"name": "135", "dtype": "float32"}, {"name": "136", "dtype": "float32"}, {"name": "137", "dtype": "float32"}, {"name": "138", "dtype": "float32"}, {"name": "139", "dtype": "float32"}, {"name": "140", "dtype": "float32"}, {"name": "141", "dtype": "float32"}, {"name": "142", "dtype": "float32"}, {"name": "143", "dtype": "float32"}, {"name": "144", "dtype": "float32"}, {"name": "145", "dtype": "float32"}, {"name": "146", "dtype": "float32"}, {"name": "147", "dtype": "float32"}, {"name": "148", "dtype": "float32"}, {"name": "149", "dtype": "float32"}, {"name": "150", "dtype": "float32"}, {"name": "151", "dtype": "float32"}, {"name": "152", "dtype": "float32"}, {"name": "153", "dtype": "float32"}, {"name": "154", "dtype": "float32"}, {"name": "155", "dtype": "float32"}, {"name": "156", "dtype": "float32"}, {"name": "157", "dtype": "float32"}, {"name": "158", "dtype": "float32"}, {"name": "159", "dtype": "float32"}, {"name": "160", "dtype": "float32"}, {"name": "161", "dtype": "float32"}, {"name": "162", "dtype": "float32"}, {"name": "163", "dtype": "float32"}, {"name": "164", "dtype": "float32"}, {"name": "165", "dtype": "float32"}, {"name": "166", "dtype": "float32"}, {"name": "167", "dtype": "float32"}, {"name": "168", "dtype": "float32"}, {"name": "169", "dtype": "float32"}, {"name": "170", "dtype": "float32"}, {"name": "171", "dtype": "float32"}, {"name": "172", "dtype": "float32"}, {"name": "173", "dtype": "float32"}, {"name": "174", "dtype": "float32"}, {"name": "175", "dtype": "float32"}, {"name": "176", "dtype": "float32"}, {"name": "177", "dtype": "float32"}, {"name": "178", "dtype": "float32"}, {"name": "179", "dtype": "float32"}, {"name": "180", "dtype": "float32"}, {"name": "181", "dtype": "float32"}, {"name": "182", "dtype": "float32"}, {"name": "183", "dtype": "float32"}, {"name": "184", "dtype": "float32"}, {"name": "185", "dtype": "float32"}, {"name": "186", "dtype": "float32"}, {"name": "187", "dtype": "float32"}, {"name": "188", "dtype": "float32"}, {"name": "189", "dtype": "float32"}, {"name": 
"190", "dtype": "float32"}, {"name": "191", "dtype": "float32"}, {"name": "192", "dtype": "float32"}, {"name": "193", "dtype": "float32"}, {"name": "194", "dtype": "float32"}, {"name": "195", "dtype": "float32"}, {"name": "196", "dtype": "float32"}, {"name": "197", "dtype": "float32"}, {"name": "198", "dtype": "float32"}, {"name": "199", "dtype": "float32"}, {"name": "200", "dtype": "float32"}, {"name": "201", "dtype": "float32"}, {"name": "202", "dtype": "float32"}, {"name": "203", "dtype": "float32"}, {"name": "204", "dtype": "float32"}, {"name": "205", "dtype": "float32"}, {"name": "206", "dtype": "float32"}, {"name": "207", "dtype": "float32"}, {"name": "208", "dtype": "float32"}, {"name": "209", "dtype": "float32"}, {"name": "210", "dtype": "float32"}, {"name": "211", "dtype": "float32"}, {"name": "212", "dtype": "float32"}, {"name": "213", "dtype": "float32"}, {"name": "214", "dtype": "float32"}, {"name": "215", "dtype": "float32"}, {"name": "216", "dtype": "float32"}, {"name": "217", "dtype": "float32"}, {"name": "218", "dtype": "float32"}, {"name": "219", "dtype": "float32"}, {"name": "220", "dtype": "float32"}, {"name": "221", "dtype": "float32"}, {"name": "222", "dtype": "float32"}, {"name": "223", "dtype": "float32"}, {"name": "224", "dtype": "float32"}, {"name": "225", "dtype": "float32"}, {"name": "226", "dtype": "float32"}, {"name": "227", "dtype": "float32"}, {"name": "228", "dtype": "float32"}, {"name": "229", "dtype": "float32"}, {"name": "230", "dtype": "float32"}, {"name": "231", "dtype": "float32"}, {"name": "232", "dtype": "float32"}, {"name": "233", "dtype": "float32"}, {"name": "234", "dtype": "float32"}, {"name": "235", "dtype": "float32"}, {"name": "236", "dtype": "float32"}, {"name": "237", "dtype": "float32"}, {"name": "238", "dtype": "float32"}, {"name": "239", "dtype": "float32"}, {"name": "240", "dtype": "float32"}, {"name": "241", "dtype": "float32"}, {"name": "242", "dtype": "float32"}, {"name": "243", "dtype": "float32"}, {"name": "244", "dtype": "float32"}, {"name": "245", "dtype": "float32"}, {"name": "246", "dtype": "float32"}, {"name": "247", "dtype": "float32"}, {"name": "248", "dtype": "float32"}, {"name": "249", "dtype": "float32"}, {"name": "250", "dtype": "float32"}, {"name": "251", "dtype": "float32"}, {"name": "252", "dtype": "float32"}, {"name": "253", "dtype": "float32"}, {"name": "254", "dtype": "float32"}, {"name": "255", "dtype": "float32"}, {"name": "256", "dtype": "float32"}, {"name": "257", "dtype": "float32"}, {"name": "258", "dtype": "float32"}, {"name": "259", "dtype": "float32"}, {"name": "260", "dtype": "float32"}, {"name": "261", "dtype": "float32"}, {"name": "262", "dtype": "float32"}, {"name": "263", "dtype": "float32"}, {"name": "264", "dtype": "float32"}, {"name": "265", "dtype": "float32"}, {"name": "266", "dtype": "float32"}, {"name": "267", "dtype": "float32"}, {"name": "268", "dtype": "float32"}, {"name": "269", "dtype": "float32"}, {"name": "270", "dtype": "float32"}, {"name": "271", "dtype": "float32"}, {"name": "272", "dtype": "float32"}, {"name": "273", "dtype": "float32"}, {"name": "274", "dtype": "float32"}, {"name": "275", "dtype": "float32"}, {"name": "276", "dtype": "float32"}, {"name": "277", "dtype": "float32"}, {"name": "278", "dtype": "float32"}, {"name": "279", "dtype": "float32"}, {"name": "280", "dtype": "float32"}, {"name": "281", "dtype": "float32"}, {"name": "282", "dtype": "float32"}, {"name": "283", "dtype": "float32"}, {"name": "284", "dtype": "float32"}, {"name": "285", "dtype": "float32"}, {"name": 
"286", "dtype": "float32"}, {"name": "287", "dtype": "float32"}, {"name": "288", "dtype": "float32"}, {"name": "289", "dtype": "float32"}, {"name": "290", "dtype": "float32"}, {"name": "291", "dtype": "float32"}, {"name": "292", "dtype": "float32"}, {"name": "293", "dtype": "float32"}, {"name": "294", "dtype": "float32"}, {"name": "295", "dtype": "float32"}, {"name": "296", "dtype": "float32"}, {"name": "297", "dtype": "float32"}, {"name": "298", "dtype": "float32"}, {"name": "299", "dtype": "float32"}, {"name": "300", "dtype": "float32"}, {"name": "301", "dtype": "float32"}, {"name": "302", "dtype": "float32"}, {"name": "303", "dtype": "float32"}, {"name": "304", "dtype": "float32"}, {"name": "305", "dtype": "float32"}, {"name": "306", "dtype": "float32"}, {"name": "307", "dtype": "float32"}, {"name": "308", "dtype": "float32"}, {"name": "309", "dtype": "float32"}, {"name": "310", "dtype": "float32"}, {"name": "311", "dtype": "float32"}, {"name": "312", "dtype": "float32"}, {"name": "313", "dtype": "float32"}, {"name": "314", "dtype": "float32"}, {"name": "315", "dtype": "float32"}, {"name": "316", "dtype": "float32"}, {"name": "317", "dtype": "float32"}, {"name": "318", "dtype": "float32"}, {"name": "319", "dtype": "float32"}, {"name": "320", "dtype": "float32"}, {"name": "321", "dtype": "float32"}, {"name": "322", "dtype": "float32"}, {"name": "323", "dtype": "float32"}, {"name": "324", "dtype": "float32"}, {"name": "325", "dtype": "float32"}, {"name": "326", "dtype": "float32"}, {"name": "327", "dtype": "float32"}, {"name": "328", "dtype": "float32"}, {"name": "329", "dtype": "float32"}, {"name": "330", "dtype": "float32"}, {"name": "331", "dtype": "float32"}, {"name": "332", "dtype": "float32"}, {"name": "333", "dtype": "float32"}, {"name": "334", "dtype": "float32"}, {"name": "335", "dtype": "float32"}, {"name": "336", "dtype": "float32"}, {"name": "337", "dtype": "float32"}, {"name": "338", "dtype": "float32"}, {"name": "339", "dtype": "float32"}, {"name": "340", "dtype": "float32"}, {"name": "341", "dtype": "float32"}, {"name": "342", "dtype": "float32"}, {"name": "343", "dtype": "float32"}, {"name": "344", "dtype": "float32"}, {"name": "345", "dtype": "float32"}, {"name": "346", "dtype": "float32"}, {"name": "347", "dtype": "float32"}, {"name": "348", "dtype": "float32"}, {"name": "349", "dtype": "float32"}, {"name": "350", "dtype": "float32"}, {"name": "351", "dtype": "float32"}, {"name": "352", "dtype": "float32"}, {"name": "353", "dtype": "float32"}, {"name": "354", "dtype": "float32"}, {"name": "355", "dtype": "float32"}, {"name": "356", "dtype": "float32"}, {"name": "357", "dtype": "float32"}, {"name": "358", "dtype": "float32"}, {"name": "359", "dtype": "float32"}, {"name": "360", "dtype": "float32"}, {"name": "361", "dtype": "float32"}, {"name": "362", "dtype": "float32"}, {"name": "363", "dtype": "float32"}, {"name": "364", "dtype": "float32"}, {"name": "365", "dtype": "float32"}, {"name": "366", "dtype": "float32"}, {"name": "367", "dtype": "float32"}, {"name": "368", "dtype": "float32"}, {"name": "369", "dtype": "float32"}, {"name": "370", "dtype": "float32"}, {"name": "371", "dtype": "float32"}, {"name": "372", "dtype": "float32"}, {"name": "373", "dtype": "float32"}, {"name": "374", "dtype": "float32"}, {"name": "375", "dtype": "float32"}, {"name": "376", "dtype": "float32"}, {"name": "377", "dtype": "float32"}, {"name": "378", "dtype": "float32"}, {"name": "379", "dtype": "float32"}, {"name": "380", "dtype": "float32"}, {"name": "381", "dtype": "float32"}, {"name": 
"382", "dtype": "float32"}, {"name": "383", "dtype": "float32"}, {"name": "384", "dtype": "float32"}, {"name": "385", "dtype": "float32"}, {"name": "386", "dtype": "float32"}, {"name": "387", "dtype": "float32"}, {"name": "388", "dtype": "float32"}, {"name": "389", "dtype": "float32"}, {"name": "390", "dtype": "float32"}, {"name": "391", "dtype": "float32"}, {"name": "392", "dtype": "float32"}, {"name": "393", "dtype": "float32"}, {"name": "394", "dtype": "float32"}, {"name": "395", "dtype": "float32"}, {"name": "396", "dtype": "float32"}, {"name": "397", "dtype": "float32"}, {"name": "398", "dtype": "float32"}, {"name": "399", "dtype": "float32"}, {"name": "400", "dtype": "float32"}, {"name": "401", "dtype": "float32"}, {"name": "402", "dtype": "float32"}, {"name": "403", "dtype": "float32"}, {"name": "404", "dtype": "float32"}, {"name": "405", "dtype": "float32"}, {"name": "406", "dtype": "float32"}, {"name": "407", "dtype": "float32"}, {"name": "408", "dtype": "float32"}, {"name": "409", "dtype": "float32"}, {"name": "410", "dtype": "float32"}, {"name": "411", "dtype": "float32"}, {"name": "412", "dtype": "float32"}, {"name": "413", "dtype": "float32"}, {"name": "414", "dtype": "float32"}, {"name": "415", "dtype": "float32"}, {"name": "416", "dtype": "float32"}, {"name": "417", "dtype": "float32"}, {"name": "418", "dtype": "float32"}, {"name": "419", "dtype": "float32"}, {"name": "420", "dtype": "float32"}, {"name": "421", "dtype": "float32"}, {"name": "422", "dtype": "float32"}, {"name": "423", "dtype": "float32"}, {"name": "424", "dtype": "float32"}, {"name": "425", "dtype": "float32"}, {"name": "426", "dtype": "float32"}, {"name": "427", "dtype": "float32"}, {"name": "428", "dtype": "float32"}, {"name": "429", "dtype": "float32"}, {"name": "430", "dtype": "float32"}, {"name": "431", "dtype": "float32"}, {"name": "432", "dtype": "float32"}, {"name": "433", "dtype": "float32"}, {"name": "434", "dtype": "float32"}, {"name": "435", "dtype": "float32"}, {"name": "436", "dtype": "float32"}, {"name": "437", "dtype": "float32"}, {"name": "438", "dtype": "float32"}, {"name": "439", "dtype": "float32"}, {"name": "440", "dtype": "float32"}, {"name": "441", "dtype": "float32"}, {"name": "442", "dtype": "float32"}, {"name": "443", "dtype": "float32"}, {"name": "444", "dtype": "float32"}, {"name": "445", "dtype": "float32"}, {"name": "446", "dtype": "float32"}, {"name": "447", "dtype": "float32"}, {"name": "448", "dtype": "float32"}, {"name": "449", "dtype": "float32"}, {"name": "450", "dtype": "float32"}, {"name": "451", "dtype": "float32"}, {"name": "452", "dtype": "float32"}, {"name": "453", "dtype": "float32"}, {"name": "454", "dtype": "float32"}, {"name": "455", "dtype": "float32"}, {"name": "456", "dtype": "float32"}, {"name": "457", "dtype": "float32"}, {"name": "458", "dtype": "float32"}, {"name": "459", "dtype": "float32"}, {"name": "460", "dtype": "float32"}, {"name": "461", "dtype": "float32"}, {"name": "462", "dtype": "float32"}, {"name": "463", "dtype": "float32"}, {"name": "464", "dtype": "float32"}, {"name": "465", "dtype": "float32"}, {"name": "466", "dtype": "float32"}, {"name": "467", "dtype": "float32"}, {"name": "468", "dtype": "float32"}, {"name": "469", "dtype": "float32"}, {"name": "470", "dtype": "float32"}, {"name": "471", "dtype": "float32"}, {"name": "472", "dtype": "float32"}, {"name": "473", "dtype": "float32"}, {"name": "474", "dtype": "float32"}, {"name": "475", "dtype": "float32"}, {"name": "476", "dtype": "float32"}, {"name": "477", "dtype": "float32"}, {"name": 
"478", "dtype": "float32"}, {"name": "479", "dtype": "float32"}, {"name": "480", "dtype": "float32"}, {"name": "481", "dtype": "float32"}, {"name": "482", "dtype": "float32"}, {"name": "483", "dtype": "float32"}, {"name": "484", "dtype": "float32"}, {"name": "485", "dtype": "float32"}, {"name": "486", "dtype": "float32"}, {"name": "487", "dtype": "float32"}, {"name": "488", "dtype": "float32"}, {"name": "489", "dtype": "float32"}, {"name": "490", "dtype": "float32"}, {"name": "491", "dtype": "float32"}, {"name": "492", "dtype": "float32"}, {"name": "493", "dtype": "float32"}, {"name": "494", "dtype": "float32"}, {"name": "495", "dtype": "float32"}, {"name": "496", "dtype": "float32"}, {"name": "497", "dtype": "float32"}, {"name": "498", "dtype": "float32"}, {"name": "499", "dtype": "float32"}, {"name": "500", "dtype": "float32"}, {"name": "501", "dtype": "float32"}, {"name": "502", "dtype": "float32"}, {"name": "503", "dtype": "float32"}, {"name": "504", "dtype": "float32"}, {"name": "505", "dtype": "float32"}, {"name": "506", "dtype": "float32"}, {"name": "507", "dtype": "float32"}, {"name": "508", "dtype": "float32"}, {"name": "509", "dtype": "float32"}, {"name": "510", "dtype": "float32"}, {"name": "511", "dtype": "float32"}, {"name": "512", "dtype": "float32"}, {"name": "513", "dtype": "float32"}, {"name": "514", "dtype": "float32"}, {"name": "515", "dtype": "float32"}, {"name": "516", "dtype": "float32"}, {"name": "517", "dtype": "float32"}, {"name": "518", "dtype": "float32"}, {"name": "519", "dtype": "float32"}, {"name": "520", "dtype": "float32"}, {"name": "521", "dtype": "float32"}, {"name": "522", "dtype": "float32"}, {"name": "523", "dtype": "float32"}, {"name": "524", "dtype": "float32"}, {"name": "525", "dtype": "float32"}, {"name": "526", "dtype": "float32"}, {"name": "527", "dtype": "float32"}, {"name": "528", "dtype": "float32"}, {"name": "529", "dtype": "float32"}, {"name": "530", "dtype": "float32"}, {"name": "531", "dtype": "float32"}, {"name": "532", "dtype": "float32"}, {"name": "533", "dtype": "float32"}, {"name": "534", "dtype": "float32"}, {"name": "535", "dtype": "float32"}, {"name": "536", "dtype": "float32"}, {"name": "537", "dtype": "float32"}, {"name": "538", "dtype": "float32"}, {"name": "539", "dtype": "float32"}, {"name": "540", "dtype": "float32"}, {"name": "541", "dtype": "float32"}, {"name": "542", "dtype": "float32"}, {"name": "543", "dtype": "float32"}, {"name": "544", "dtype": "float32"}, {"name": "545", "dtype": "float32"}, {"name": "546", "dtype": "float32"}, {"name": "547", "dtype": "float32"}, {"name": "548", "dtype": "float32"}, {"name": "549", "dtype": "float32"}, {"name": "550", "dtype": "float32"}, {"name": "551", "dtype": "float32"}, {"name": "552", "dtype": "float32"}, {"name": "553", "dtype": "float32"}, {"name": "554", "dtype": "float32"}, {"name": "555", "dtype": "float32"}, {"name": "556", "dtype": "float32"}, {"name": "557", "dtype": "float32"}, {"name": "558", "dtype": "float32"}, {"name": "559", "dtype": "float32"}, {"name": "560", "dtype": "float32"}, {"name": "561", "dtype": "float32"}, {"name": "562", "dtype": "float32"}, {"name": "563", "dtype": "float32"}, {"name": "564", "dtype": "float32"}, {"name": "565", "dtype": "float32"}, {"name": "566", "dtype": "float32"}, {"name": "567", "dtype": "float32"}, {"name": "568", "dtype": "float32"}, {"name": "569", "dtype": "float32"}, {"name": "570", "dtype": "float32"}, {"name": "571", "dtype": "float32"}, {"name": "572", "dtype": "float32"}, {"name": "573", "dtype": "float32"}, {"name": 
"574", "dtype": "float32"}, {"name": "575", "dtype": "float32"}, {"name": "576", "dtype": "float32"}, {"name": "577", "dtype": "float32"}, {"name": "578", "dtype": "float32"}, {"name": "579", "dtype": "float32"}, {"name": "580", "dtype": "float32"}, {"name": "581", "dtype": "float32"}, {"name": "582", "dtype": "float32"}, {"name": "583", "dtype": "float32"}, {"name": "584", "dtype": "float32"}, {"name": "585", "dtype": "float32"}, {"name": "586", "dtype": "float32"}, {"name": "587", "dtype": "float32"}, {"name": "588", "dtype": "float32"}, {"name": "589", "dtype": "float32"}, {"name": "590", "dtype": "float32"}, {"name": "591", "dtype": "float32"}, {"name": "592", "dtype": "float32"}, {"name": "593", "dtype": "float32"}, {"name": "594", "dtype": "float32"}, {"name": "595", "dtype": "float32"}, {"name": "596", "dtype": "float32"}, {"name": "597", "dtype": "float32"}, {"name": "598", "dtype": "float32"}, {"name": "599", "dtype": "float32"}, {"name": "600", "dtype": "float32"}, {"name": "601", "dtype": "float32"}, {"name": "602", "dtype": "float32"}, {"name": "603", "dtype": "float32"}, {"name": "604", "dtype": "float32"}, {"name": "605", "dtype": "float32"}, {"name": "606", "dtype": "float32"}, {"name": "607", "dtype": "float32"}, {"name": "608", "dtype": "float32"}, {"name": "609", "dtype": "float32"}, {"name": "610", "dtype": "float32"}, {"name": "611", "dtype": "float32"}, {"name": "612", "dtype": "float32"}, {"name": "613", "dtype": "float32"}, {"name": "614", "dtype": "float32"}, {"name": "615", "dtype": "float32"}, {"name": "616", "dtype": "float32"}, {"name": "617", "dtype": "float32"}, {"name": "618", "dtype": "float32"}, {"name": "619", "dtype": "float32"}, {"name": "620", "dtype": "float32"}, {"name": "621", "dtype": "float32"}, {"name": "622", "dtype": "float32"}, {"name": "623", "dtype": "float32"}, {"name": "624", "dtype": "float32"}, {"name": "625", "dtype": "float32"}, {"name": "626", "dtype": "float32"}, {"name": "627", "dtype": "float32"}, {"name": "628", "dtype": "float32"}, {"name": "629", "dtype": "float32"}, {"name": "630", "dtype": "float32"}, {"name": "631", "dtype": "float32"}, {"name": "632", "dtype": "float32"}, {"name": "633", "dtype": "float32"}, {"name": "634", "dtype": "float32"}, {"name": "635", "dtype": "float32"}, {"name": "636", "dtype": "float32"}, {"name": "637", "dtype": "float32"}, {"name": "638", "dtype": "float32"}, {"name": "639", "dtype": "float32"}, {"name": "640", "dtype": "float32"}, {"name": "641", "dtype": "float32"}, {"name": "642", "dtype": "float32"}, {"name": "643", "dtype": "float32"}, {"name": "644", "dtype": "float32"}, {"name": "645", "dtype": "float32"}, {"name": "646", "dtype": "float32"}, {"name": "647", "dtype": "float32"}, {"name": "648", "dtype": "float32"}, {"name": "649", "dtype": "float32"}, {"name": "650", "dtype": "float32"}, {"name": "651", "dtype": "float32"}, {"name": "652", "dtype": "float32"}, {"name": "653", "dtype": "float32"}, {"name": "654", "dtype": "float32"}, {"name": "655", "dtype": "float32"}, {"name": "656", "dtype": "float32"}, {"name": "657", "dtype": "float32"}, {"name": "658", "dtype": "float32"}, {"name": "659", "dtype": "float32"}, {"name": "660", "dtype": "float32"}, {"name": "661", "dtype": "float32"}, {"name": "662", "dtype": "float32"}, {"name": "663", "dtype": "float32"}, {"name": "664", "dtype": "float32"}, {"name": "665", "dtype": "float32"}, {"name": "666", "dtype": "float32"}, {"name": "667", "dtype": "float32"}, {"name": "668", "dtype": "float32"}, {"name": "669", "dtype": "float32"}, {"name": 
"670", "dtype": "float32"}, {"name": "671", "dtype": "float32"}, {"name": "672", "dtype": "float32"}, {"name": "673", "dtype": "float32"}, {"name": "674", "dtype": "float32"}, {"name": "675", "dtype": "float32"}, {"name": "676", "dtype": "float32"}, {"name": "677", "dtype": "float32"}, {"name": "678", "dtype": "float32"}, {"name": "679", "dtype": "float32"}, {"name": "680", "dtype": "float32"}, {"name": "681", "dtype": "float32"}, {"name": "682", "dtype": "float32"}, {"name": "683", "dtype": "float32"}, {"name": "684", "dtype": "float32"}, {"name": "685", "dtype": "float32"}, {"name": "686", "dtype": "float32"}, {"name": "687", "dtype": "float32"}, {"name": "688", "dtype": "float32"}, {"name": "689", "dtype": "float32"}, {"name": "690", "dtype": "float32"}, {"name": "691", "dtype": "float32"}, {"name": "692", "dtype": "float32"}, {"name": "693", "dtype": "float32"}, {"name": "694", "dtype": "float32"}, {"name": "695", "dtype": "float32"}, {"name": "696", "dtype": "float32"}, {"name": "697", "dtype": "float32"}, {"name": "698", "dtype": "float32"}, {"name": "699", "dtype": "float32"}, {"name": "700", "dtype": "float32"}, {"name": "701", "dtype": "float32"}, {"name": "702", "dtype": "float32"}, {"name": "703", "dtype": "float32"}, {"name": "704", "dtype": "float32"}, {"name": "705", "dtype": "float32"}, {"name": "706", "dtype": "float32"}, {"name": "707", "dtype": "float32"}, {"name": "708", "dtype": "float32"}, {"name": "709", "dtype": "float32"}, {"name": "710", "dtype": "float32"}, {"name": "711", "dtype": "float32"}, {"name": "712", "dtype": "float32"}, {"name": "713", "dtype": "float32"}, {"name": "714", "dtype": "float32"}, {"name": "715", "dtype": "float32"}, {"name": "716", "dtype": "float32"}, {"name": "717", "dtype": "float32"}, {"name": "718", "dtype": "float32"}, {"name": "719", "dtype": "float32"}, {"name": "720", "dtype": "float32"}, {"name": "721", "dtype": "float32"}, {"name": "722", "dtype": "float32"}, {"name": "723", "dtype": "float32"}, {"name": "724", "dtype": "float32"}, {"name": "725", "dtype": "float32"}, {"name": "726", "dtype": "float32"}, {"name": "727", "dtype": "float32"}, {"name": "728", "dtype": "float32"}, {"name": "729", "dtype": "float32"}, {"name": "730", "dtype": "float32"}, {"name": "731", "dtype": "float32"}, {"name": "732", "dtype": "float32"}, {"name": "733", "dtype": "float32"}, {"name": "734", "dtype": "float32"}, {"name": "735", "dtype": "float32"}, {"name": "736", "dtype": "float32"}, {"name": "737", "dtype": "float32"}, {"name": "738", "dtype": "float32"}, {"name": "739", "dtype": "float32"}, {"name": "740", "dtype": "float32"}, {"name": "741", "dtype": "float32"}, {"name": "742", "dtype": "float32"}, {"name": "743", "dtype": "float32"}, {"name": "744", "dtype": "float32"}, {"name": "745", "dtype": "float32"}, {"name": "746", "dtype": "float32"}, {"name": "747", "dtype": "float32"}, {"name": "748", "dtype": "float32"}, {"name": "749", "dtype": "float32"}, {"name": "750", "dtype": "float32"}, {"name": "751", "dtype": "float32"}, {"name": "752", "dtype": "float32"}, {"name": "753", "dtype": "float32"}, {"name": "754", "dtype": "float32"}, {"name": "755", "dtype": "float32"}, {"name": "756", "dtype": "float32"}, {"name": "757", "dtype": "float32"}, {"name": "758", "dtype": "float32"}, {"name": "759", "dtype": "float32"}, {"name": "760", "dtype": "float32"}, {"name": "761", "dtype": "float32"}, {"name": "762", "dtype": "float32"}, {"name": "763", "dtype": "float32"}, {"name": "764", "dtype": "float32"}, {"name": "765", "dtype": "float32"}, {"name": 
"766", "dtype": "float32"}, {"name": "767", "dtype": "float32"}, {"name": "label", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 115608907.5, "num_examples": 37500}, {"name": "test", "num_bytes": 38536305.0, "num_examples": 12500}], "download_size": 211871323, "dataset_size": 154145212.5}}
|
2023-08-23T04:31:34+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "PKDD_GPT2_Finetuned"
More Information needed
|
[
"# Dataset Card for \"PKDD_GPT2_Finetuned\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"PKDD_GPT2_Finetuned\"\n\nMore Information needed"
] |
[
6,
21
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"PKDD_GPT2_Finetuned\"\n\nMore Information needed"
] |
20662fea40e8a8597e83ccf7acae84d9a6c2ccc5
|
## Dataset Introduction
This dataset is a collection of Malaysian texts in the Malay, English, Chinese, and Tamil languages, gathered by Malaysia AI volunteers through web crawling of Malaysian websites.
The dataset amounts to approximately 80 GB of text data and has undergone a deduplication process.
## Project Link
To learn more about the ongoing project and updates related to this dataset, visit the project board on GitHub:
https://github.com/users/huseinzol05/projects/1/views/1
## Github Repo
Our data preprocessing and deduplication processes are transparent and open for review. You can find the code and documentation related to these processes at https://github.com/malaysia-ai/text-dataset-dedup
## Data Format
The entire dataset is standardized in JSONL (JSON Lines) format, with each line containing a text snippet.
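As a minimal illustration of that format, a JSONL shard can be streamed line by line; the filename below is hypothetical, so substitute an actual file from the repository.
```python
import json

# Hypothetical shard name; use a real file downloaded from the repository.
with open("dedup-text.jsonl", encoding="utf-8") as f:
    for line in f:
        record = json.loads(line)  # each line is one standalone JSON object
        print(record)              # containing a text snippet
```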
|
malaysia-ai/dedup-text-dataset
|
[
"language:ms",
"language:en",
"language:zh",
"language:ta",
"region:us"
] |
2023-08-18T01:32:57+00:00
|
{"language": ["ms", "en", "zh", "ta"]}
|
2023-11-21T02:16:37+00:00
|
[] |
[
"ms",
"en",
"zh",
"ta"
] |
TAGS
#language-Malay (macrolanguage) #language-English #language-Chinese #language-Tamil #region-us
|
## Dataset Introduction
This dataset is a collection of Malaysian texts in the Malay, English, Chinese, and Tamil languages, gathered by Malaysia AI volunteers through web crawling of Malaysian websites.
The dataset amounts to approximately 80 GB of text data and has undergone a deduplication process.
## Project Link
To learn more about the ongoing project and updates related to this dataset, visit the project board on GitHub:
URL
## Github Repo
Our data preprocessing and deduplication processes are transparent and open for review. You can find the code and documentation related to these processes at URL
## Data Format
The entire dataset is standardized in JSONL (JSON Lines) format, with each line containing a text snippet.
|
[
"## Dataset Introduction\n\nThis dataset is a collection of malaysian texts in the Malay, English, Chinese, and Tamil languages, gathered by Malaysia AI volunteers through web crawling of malaysian websites. \nThe dataset amounts to approximately 80 GB of text data, and has undergone deduplication process.",
"## Project Link\n\nTo learn more about the ongoing project and updates related to this dataset, visit the project board on GitHub:\n\nURL",
"## Github Repo\n\nOur data preprocessing, and deduplication processes are transparent and open for review. You can find the code and documentation related to these processes in URL",
"## Data Format\nAll the dataset is standardized in JSONL (JSON Lines) format, with each line containing a text snippet."
] |
[
"TAGS\n#language-Malay (macrolanguage) #language-English #language-Chinese #language-Tamil #region-us \n",
"## Dataset Introduction\n\nThis dataset is a collection of malaysian texts in the Malay, English, Chinese, and Tamil languages, gathered by Malaysia AI volunteers through web crawling of malaysian websites. \nThe dataset amounts to approximately 80 GB of text data, and has undergone deduplication process.",
"## Project Link\n\nTo learn more about the ongoing project and updates related to this dataset, visit the project board on GitHub:\n\nURL",
"## Github Repo\n\nOur data preprocessing, and deduplication processes are transparent and open for review. You can find the code and documentation related to these processes in URL",
"## Data Format\nAll the dataset is standardized in JSONL (JSON Lines) format, with each line containing a text snippet."
] |
[
29,
70,
28,
39,
33
] |
[
"passage: TAGS\n#language-Malay (macrolanguage) #language-English #language-Chinese #language-Tamil #region-us \n## Dataset Introduction\n\nThis dataset is a collection of malaysian texts in the Malay, English, Chinese, and Tamil languages, gathered by Malaysia AI volunteers through web crawling of malaysian websites. \nThe dataset amounts to approximately 80 GB of text data, and has undergone deduplication process.## Project Link\n\nTo learn more about the ongoing project and updates related to this dataset, visit the project board on GitHub:\n\nURL## Github Repo\n\nOur data preprocessing, and deduplication processes are transparent and open for review. You can find the code and documentation related to these processes in URL## Data Format\nAll the dataset is standardized in JSONL (JSON Lines) format, with each line containing a text snippet."
] |
a4aae46e702e5cd72d22aae3e05aed4c3afb0000
|
# Dataset Card for "MuraTransformed"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
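As a hedged usage sketch based on this record's metadata (each row carries an `image`, an int64 `label`, and precomputed `pixel_values`), streaming one example might look like this:
```python
from datasets import load_dataset

# Stream the train split to avoid downloading the full ~6.5 GB archive at once.
ds = load_dataset("KhalfounMehdi/MuraTransformed", split="train", streaming=True)

row = next(iter(ds))
# image is decoded to a PIL image; pixel_values is a nested float32 sequence.
print(row["label"], row["image"].size, len(row["pixel_values"]))
```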
|
KhalfounMehdi/MuraTransformed
|
[
"region:us"
] |
2023-08-18T01:36:43+00:00
|
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "label", "dtype": "int64"}, {"name": "pixel_values", "sequence": {"sequence": {"sequence": "float32"}}}], "splits": [{"name": "train", "num_bytes": 27563908768.375, "num_examples": 40005}], "download_size": 6481648040, "dataset_size": 27563908768.375}}
|
2023-08-18T01:49:03+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "MuraTransformed"
More Information needed
|
[
"# Dataset Card for \"MuraTransformed\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"MuraTransformed\"\n\nMore Information needed"
] |
[
6,
15
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"MuraTransformed\"\n\nMore Information needed"
] |
c4e30a1b89084bfe9552f2c53d67f98e3548d1fb
|
# Dataset of tatara_kogasa/多々良小傘/타타라코가사 (Touhou)
This is the dataset of tatara_kogasa/多々良小傘/타타라코가사 (Touhou), containing 500 images and their tags.
The core tags of this character are `blue_hair, short_hair, red_eyes, blue_eyes, heterochromia`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 565.30 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tatara_kogasa_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 366.29 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tatara_kogasa_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1094 | 703.67 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tatara_kogasa_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 515.92 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tatara_kogasa_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1094 | 919.17 MiB | [Download](https://huggingface.co/datasets/CyberHarem/tatara_kogasa_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/tatara_kogasa_touhou',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
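As a follow-up to the loader above, here is a hedged sketch of filtering the loaded items by a tag. It assumes `item.meta['tags']` is a mapping (or collection) of tag names, which matches the `print` statement above but is not guaranteed by the card; the tag `karakasa_obake` is just an illustrative choice from the cluster tables below.

```python
from waifuc.source import LocalSource

source = LocalSource('dataset_dir')
wanted = 'karakasa_obake'  # illustrative tag taken from the clusters below
for item in source:
    tags = item.meta.get('tags', {})
    if wanted in tags:  # works for both dict-of-scores and list-of-names layouts
        item.image.save(f"filtered_{item.meta['filename']}")  # item.image is a PIL image
```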
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 12 |  |  |  |  |  | 1girl, geta, karakasa_obake, purple_umbrella, smile, solo, tongue, skirt, open_mouth |
| 1 | 10 |  |  |  |  |  | 1girl, :p, karakasa_obake, purple_umbrella, solo, geta |
| 2 | 9 |  |  |  |  |  | 1girl, :p, karakasa_obake, purple_umbrella, skirt, solo, smile, blush |
| 3 | 9 |  |  |  |  |  | 1girl, juliet_sleeves, karakasa_obake, skirt, smile, solo, looking_at_viewer, shirt, vest, open_mouth, tongue, geta, white_background |
| 4 | 25 |  |  |  |  |  | 1girl, solo, blue_vest, holding_umbrella, juliet_sleeves, karakasa_obake, looking_at_viewer, white_shirt, blue_skirt, bangs, smile, purple_umbrella, :p, blush, open_mouth |
| 5 | 6 |  |  |  |  |  | 1girl, bangs, blue_skirt, blue_vest, closed_mouth, holding_umbrella, juliet_sleeves, karakasa_obake, solo, standing, white_shirt, full_body, geta, looking_at_viewer |
| 6 | 5 |  |  |  |  |  | 1girl, blush, hetero, nipples, sex, solo_focus, vaginal, 1boy, girl_on_top, navel, open_mouth, pov, completely_nude, cowgirl_position, looking_at_viewer, mosaic_censoring, penis, cum_in_pussy, female_pubic_hair, large_breasts, spread_legs, sweat |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | geta | karakasa_obake | purple_umbrella | smile | solo | tongue | skirt | open_mouth | :p | blush | juliet_sleeves | looking_at_viewer | shirt | vest | white_background | blue_vest | holding_umbrella | white_shirt | blue_skirt | bangs | closed_mouth | standing | full_body | hetero | nipples | sex | solo_focus | vaginal | 1boy | girl_on_top | navel | pov | completely_nude | cowgirl_position | mosaic_censoring | penis | cum_in_pussy | female_pubic_hair | large_breasts | spread_legs | sweat |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:-----------------|:------------------|:--------|:-------|:---------|:--------|:-------------|:-----|:--------|:-----------------|:--------------------|:--------|:-------|:-------------------|:------------|:-------------------|:--------------|:-------------|:--------|:---------------|:-----------|:------------|:---------|:----------|:------|:-------------|:----------|:-------|:--------------|:--------|:------|:------------------|:-------------------|:-------------------|:--------|:---------------|:--------------------|:----------------|:--------------|:--------|
| 0 | 12 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 10 |  |  |  |  |  | X | X | X | X | | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 9 |  |  |  |  |  | X | | X | X | X | X | | X | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 9 |  |  |  |  |  | X | X | X | | X | X | X | X | X | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 25 |  |  |  |  |  | X | | X | X | X | X | | | X | X | X | X | X | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | |
| 5 | 6 |  |  |  |  |  | X | X | X | | | X | | | | | | X | X | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | |
| 6 | 5 |  |  |  |  |  | X | | | | | | | | X | | X | | X | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
CyberHarem/tatara_kogasa_touhou
|
[
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] |
2023-08-18T01:40:53+00:00
|
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
|
2024-01-14T12:45:49+00:00
|
[] |
[] |
TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
|
Dataset of tatara\_kogasa/多々良小傘/타타라코가사 (Touhou)
===============================================
This is the dataset of tatara\_kogasa/多々良小傘/타타라코가사 (Touhou), containing 500 images and their tags.
The core tags of this character are 'blue\_hair, short\_hair, red\_eyes, blue\_eyes, heterochromia', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for waifuc loading. If you need it, just run the following code.
List of Clusters
----------------
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
### Table Version
|
[
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
44,
61,
5,
4
] |
[
"passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version"
] |
0620b3f9af24f9af39f36389977770bbad25365d
|
# Dataset of wakasagihime/わかさぎ姫/와카사기히메 (Touhou)
This is the dataset of wakasagihime/わかさぎ姫/와카사기히메 (Touhou), containing 500 images and their tags.
The core tags of this character are `blue_hair, monster_girl, short_hair, blue_eyes, drill_hair, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 660.30 MiB | [Download](https://huggingface.co/datasets/CyberHarem/wakasagihime_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 393.54 MiB | [Download](https://huggingface.co/datasets/CyberHarem/wakasagihime_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1129 | 787.71 MiB | [Download](https://huggingface.co/datasets/CyberHarem/wakasagihime_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 589.99 MiB | [Download](https://huggingface.co/datasets/CyberHarem/wakasagihime_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1129 | 1.04 GiB | [Download](https://huggingface.co/datasets/CyberHarem/wakasagihime_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/wakasagihime_touhou',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
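The table above also lists IMG+TXT packages (e.g. `dataset-800.zip`). A hedged sketch for reading such a package after extraction, assuming the usual layout of one `.txt` tag file next to each same-named image — the card does not spell this layout out, so verify it against the extracted files:

```python
import os

from PIL import Image

dataset_dir = 'dataset_dir'  # directory the IMG+TXT zip was extracted into
for name in sorted(os.listdir(dataset_dir)):
    stem, ext = os.path.splitext(name)
    if ext.lower() not in {'.png', '.jpg', '.jpeg', '.webp'}:
        continue
    txt_path = os.path.join(dataset_dir, stem + '.txt')
    if not os.path.exists(txt_path):
        continue  # image without a matching tag file
    image = Image.open(os.path.join(dataset_dir, name))
    with open(txt_path, 'r', encoding='utf-8') as f:
        tags = f.read().strip()
    print(name, image.size, tags)
```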
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, green_kimono, head_fins, long_sleeves, looking_at_viewer, mermaid, obi, open_mouth, solo, wide_sleeves, frilled_kimono, simple_background, white_background, :d, blush |
| 1 | 6 |  |  |  |  |  | 1girl, frilled_kimono, green_kimono, head_fins, long_sleeves, mermaid, obi, open_mouth, smile, solo, wide_sleeves, looking_at_viewer, blush, drill_locks |
| 2 | 16 |  |  |  |  |  | 1girl, head_fins, long_sleeves, mermaid, obi, smile, solo, wide_sleeves, open_mouth, green_kimono |
| 3 | 8 |  |  |  |  |  | 1girl, air_bubble, green_kimono, head_fins, mermaid, obi, solo, underwater, wide_sleeves, long_sleeves, open_mouth, smile |
| 4 | 9 |  |  |  |  |  | 1girl, air_bubble, head_fins, kimono, long_sleeves, mermaid, obi, solo, underwater, wide_sleeves, looking_at_viewer, frills, smile |
| 5 | 6 |  |  |  |  |  | 1girl, blush, green_kimono, head_fins, long_sleeves, mermaid, obi, solo, wide_sleeves, looking_at_viewer, smile |
| 6 | 11 |  |  |  |  |  | 1girl, closed_mouth, frilled_kimono, green_kimono, head_fins, long_sleeves, looking_at_viewer, mermaid, obi, smile, solo, wide_sleeves, bangs, drill_locks, blush, large_breasts |
| 7 | 7 |  |  |  |  |  | 1girl, bangs, blush, frilled_kimono, green_kimono, head_fins, long_sleeves, looking_at_viewer, mermaid, obi, solo, underwater, wide_sleeves, air_bubble, smile, hair_between_eyes, open_mouth |
| 8 | 5 |  |  |  |  |  | 1girl, bangs, drill_locks, frilled_kimono, green_kimono, head_fins, long_sleeves, looking_at_viewer, mermaid, obi, solo, underwater, wide_sleeves, full_body, air_bubble, closed_mouth, one-hour_drawing_challenge, open_mouth, smile |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | green_kimono | head_fins | long_sleeves | looking_at_viewer | mermaid | obi | open_mouth | solo | wide_sleeves | frilled_kimono | simple_background | white_background | :d | blush | smile | drill_locks | air_bubble | underwater | kimono | frills | closed_mouth | bangs | large_breasts | hair_between_eyes | full_body | one-hour_drawing_challenge |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:------------|:---------------|:--------------------|:----------|:------|:-------------|:-------|:---------------|:-----------------|:--------------------|:-------------------|:-----|:--------|:--------|:--------------|:-------------|:-------------|:---------|:---------|:---------------|:--------|:----------------|:--------------------|:------------|:-----------------------------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | | | | X | X | X | | | | | | | | | | |
| 2 | 16 |  |  |  |  |  | X | X | X | X | | X | X | X | X | X | | | | | | X | | | | | | | | | | | |
| 3 | 8 |  |  |  |  |  | X | X | X | X | | X | X | X | X | X | | | | | | X | | X | X | | | | | | | | |
| 4 | 9 |  |  |  |  |  | X | | X | X | X | X | X | | X | X | | | | | | X | | X | X | X | X | | | | | | |
| 5 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | | X | X | | | | | X | X | | | | | | | | | | | |
| 6 | 11 |  |  |  |  |  | X | X | X | X | X | X | X | | X | X | X | | | | X | X | X | | | | | X | X | X | | | |
| 7 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | | | | X | X | | X | X | | | | X | | X | | |
| 8 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | | | | | X | X | X | X | | | X | X | | | X | X |
|
CyberHarem/wakasagihime_touhou
|
[
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] |
2023-08-18T02:02:37+00:00
|
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
|
2024-01-14T20:15:22+00:00
|
[] |
[] |
TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
|
Dataset of wakasagihime/わかさぎ姫/와카사기히메 (Touhou)
=============================================
This is the dataset of wakasagihime/わかさぎ姫/와카사기히메 (Touhou), containing 500 images and their tags.
The core tags of this character are 'blue\_hair, monster\_girl, short\_hair, blue\_eyes, drill\_hair, breasts', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for waifuc loading. If you need it, just run the following code.
List of Clusters
----------------
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
### Table Version
|
[
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
44,
61,
5,
4
] |
[
"passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version"
] |
855a6eeada7eb95eeb5eb95210431521d6a7958f
|
# Dataset Card for "truthful_qa-ja-v0.3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
HachiML/truthful_qa-ja-v0.3
|
[
"region:us"
] |
2023-08-18T02:06:01+00:00
|
{"dataset_info": {"config_name": "generation", "features": [{"name": "id", "dtype": "int64"}, {"name": "type", "dtype": "string"}, {"name": "category", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "best_answer", "dtype": "string"}, {"name": "correct_answers", "sequence": "string"}, {"name": "incorrect_answers", "sequence": "string"}, {"name": "source", "dtype": "string"}, {"name": "question_en", "dtype": "string"}, {"name": "best_answer_en", "dtype": "string"}, {"name": "correct_answers_en", "sequence": "string"}, {"name": "incorrect_answers_en", "sequence": "string"}, {"name": "meta", "struct": [{"name": "kenlm_score", "struct": [{"name": "best_answer", "dtype": "float64"}, {"name": "correct_answers", "sequence": "float64"}, {"name": "incorrect_answers", "sequence": "float64"}, {"name": "question", "dtype": "float64"}]}]}], "splits": [{"name": "validation", "num_bytes": 1034072, "num_examples": 817}], "download_size": 530242, "dataset_size": 1034072}, "configs": [{"config_name": "generation", "data_files": [{"split": "validation", "path": "generation/validation-*"}]}]}
|
2023-08-27T01:10:10+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "truthful_qa-ja-v0.3"
More Information needed
|
[
"# Dataset Card for \"truthful_qa-ja-v0.3\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"truthful_qa-ja-v0.3\"\n\nMore Information needed"
] |
[
6,
20
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"truthful_qa-ja-v0.3\"\n\nMore Information needed"
] |
d2ab734ec88669c3c124f4527b21599b71284d68
|
# Dataset of nazrin/ナズーリン/나즈린 (Touhou)
This is the dataset of nazrin/ナズーリン/나즈린 (Touhou), containing 500 images and their tags.
The core tags of this character are `animal_ears, mouse_ears, grey_hair, short_hair, red_eyes, mouse_tail, tail`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 515.21 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nazrin_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 343.77 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nazrin_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1180 | 692.31 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nazrin_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 477.24 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nazrin_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1180 | 886.71 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nazrin_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/nazrin_touhou',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
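Building on the loader above, a hedged sketch for tallying tag frequencies across the loaded items, which is a rough way to explore the clusters listed below. It assumes `item.meta['tags']` is either a mapping from tag name to score or a plain list of names; neither layout is guaranteed by the card.

```python
from collections import Counter

from waifuc.source import LocalSource

counter = Counter()
for item in LocalSource('dataset_dir'):
    tags = item.meta.get('tags', {})
    # count each tag once per image, whether tags is a mapping or a list
    counter.update(tags.keys() if isinstance(tags, dict) else tags)
print(counter.most_common(20))
```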
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 34 |  |  |  |  |  | 1girl, solo, mouse, pendant, basket, dowsing_rod, capelet |
| 1 | 7 |  |  |  |  |  | 1girl, pendant, solo, capelet, dowsing_rod |
| 2 | 5 |  |  |  |  |  | 1girl, capelet, dowsing_rod, looking_at_viewer, pendant, solo, dress, long_sleeves, open_mouth, simple_background, white_background |
| 3 | 7 |  |  |  |  |  | 1girl, capelet, cloud, pendant, sky, solo, dowsing_rod, dress, day, open_mouth, smile |
| 4 | 9 |  |  |  |  |  | 1girl, bangs, long_sleeves, mouse_girl, open_mouth, solo, white_shirt, full_body, looking_at_viewer, pendant, grey_skirt, grey_vest, grey_dress, holding, white_background, black_footwear, blush, layered_clothes, standing, white_socks, blue_capelet, dowsing_rod, shoes, skirt_set |
| 5 | 8 |  |  |  |  |  | 1girl, alternate_costume, long_sleeves, looking_at_viewer, solo, floral_print, hair_flower, bangs, smile, wide_sleeves, closed_mouth, mouse_girl, obi, happy_new_year, holding, year_of_the_rat, 2020, blue_kimono, blush, cowboy_shot, floral_background, red_flower |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | mouse | pendant | basket | dowsing_rod | capelet | looking_at_viewer | dress | long_sleeves | open_mouth | simple_background | white_background | cloud | sky | day | smile | bangs | mouse_girl | white_shirt | full_body | grey_skirt | grey_vest | grey_dress | holding | black_footwear | blush | layered_clothes | standing | white_socks | blue_capelet | shoes | skirt_set | alternate_costume | floral_print | hair_flower | wide_sleeves | closed_mouth | obi | happy_new_year | year_of_the_rat | 2020 | blue_kimono | cowboy_shot | floral_background | red_flower |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------|:----------|:---------|:--------------|:----------|:--------------------|:--------|:---------------|:-------------|:--------------------|:-------------------|:--------|:------|:------|:--------|:--------|:-------------|:--------------|:------------|:-------------|:------------|:-------------|:----------|:-----------------|:--------|:------------------|:-----------|:--------------|:---------------|:--------|:------------|:--------------------|:---------------|:--------------|:---------------|:---------------|:------|:-----------------|:------------------|:-------|:--------------|:--------------|:--------------------|:-------------|
| 0 | 34 |  |  |  |  |  | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | X | | X | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | X | | X | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 7 |  |  |  |  |  | X | X | | X | | X | X | | X | | X | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 9 |  |  |  |  |  | X | X | | X | | X | | X | | X | X | | X | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | |
| 5 | 8 |  |  |  |  |  | X | X | | | | | | X | | X | | | | | | | X | X | X | | | | | | X | | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
CyberHarem/nazrin_touhou
|
[
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] |
2023-08-18T02:30:14+00:00
|
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
|
2024-01-14T10:23:53+00:00
|
[] |
[] |
TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
|
Dataset of nazrin/ナズーリン/나즈린 (Touhou)
====================================
This is the dataset of nazrin/ナズーリン/나즈린 (Touhou), containing 500 images and their tags.
The core tags of this character are 'animal\_ears, mouse\_ears, grey\_hair, short\_hair, red\_eyes, mouse\_tail, tail', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for waifuc loading. If you need it, just run the following code.
List of Clusters
----------------
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
### Table Version
|
[
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
44,
61,
5,
4
] |
[
"passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version"
] |
5d2a0ebaa654da952eec095cbd563e7a124e83b6
|
# Dataset of hieda_no_akyuu/ひえだのあきゅう/稗田阿求 (Touhou)
This is the dataset of hieda_no_akyuu/ひえだのあきゅう/稗田阿求 (Touhou), containing 266 images and their tags.
The core tags of this character are `hair_ornament, hair_flower, purple_hair, short_hair, purple_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 266 | 303.40 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hieda_no_akyuu_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 266 | 214.90 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hieda_no_akyuu_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 562 | 390.00 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hieda_no_akyuu_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 266 | 284.28 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hieda_no_akyuu_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 562 | 478.94 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hieda_no_akyuu_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/hieda_no_akyuu_touhou',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
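One more hedged sketch building on the loader above: saving a downscaled copy of every image, e.g. to prepare training data. Only `item.image` (a PIL image, as the `print` above suggests) and `item.meta['filename']` are relied on; the 512-pixel target is an arbitrary illustrative choice.

```python
import os

from waifuc.source import LocalSource

out_dir = 'resized_512'  # illustrative output directory
os.makedirs(out_dir, exist_ok=True)
for item in LocalSource('dataset_dir'):
    img = item.image.copy()
    img.thumbnail((512, 512))  # fit within a 512x512 box, preserving aspect ratio
    img.save(os.path.join(out_dir, item.meta['filename']))
```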
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------|
| 0 | 6 |  |  |  |  |  | 1girl, flower, kimono, open_mouth, solo, smile |
| 1 | 23 |  |  |  |  |  | 1girl, flower, kimono, solo, smile, scroll |
| 2 | 7 |  |  |  |  |  | 1girl, calligraphy_brush, flower, solo, kimono, open_mouth, scroll, smile |
| 3 | 5 |  |  |  |  |  | 1girl, butterfly, flower, solo, petals, profile, green_kimono |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | flower | kimono | open_mouth | solo | smile | scroll | calligraphy_brush | butterfly | petals | profile | green_kimono |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------|:---------|:-------------|:-------|:--------|:---------|:--------------------|:------------|:---------|:----------|:---------------|
| 0 | 6 |  |  |  |  |  | X | X | X | X | X | X | | | | | | |
| 1 | 23 |  |  |  |  |  | X | X | X | | X | X | X | | | | | |
| 2 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | | | | |
| 3 | 5 |  |  |  |  |  | X | X | | | X | | | | X | X | X | X |
|
CyberHarem/hieda_no_akyuu_touhou
|
[
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] |
2023-08-18T02:46:08+00:00
|
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
|
2024-01-14T19:38:00+00:00
|
[] |
[] |
TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
|
Dataset of hieda\_no\_akyuu/ひえだのあきゅう/稗田阿求 (Touhou)
==================================================
This is the dataset of hieda\_no\_akyuu/ひえだのあきゅう/稗田阿求 (Touhou), containing 266 images and their tags.
The core tags of this character are 'hair\_ornament, hair\_flower, purple\_hair, short\_hair, purple\_eyes', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for waifuc loading. If you need it, just run the following code.
List of Clusters
----------------
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
### Table Version
|
[
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
44,
61,
5,
4
] |
[
"passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version"
] |
d3e197b86b36d9be3f6c79655c65ecb8ce539660
|
# Dataset Card for imSitu
## Dataset Description
**Homepage:** http://imsitu.org/
**Repository:** https://github.com/my89/imSitu
- The metadata used for imSitu: https://github.com/my89/imSitu#metadata
- The images can be downloaded following: https://github.com/my89/imSitu#images
- This HF dataset loads the `train.json`, `val.json` and `test.json` from the repository
**IMPORTANT NOTE**: The `frames` field in the loaded HF dataset contains a list of JSON strings (since the data structure differs from one verb frame to another). To convert the JSON strings back to dicts, you can refer to the following example:
```python
from datasets import load_dataset
import json
dataset = load_dataset("mikewang/imsitu")
print(dataset['train'][0])
frames = [json.loads(obj) for obj in dataset['train'][0]['frames']]
print(frames)
```
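As a hedged follow-up, a sketch that counts how often each verb occurs in the train split. The `verb` column is an assumption based on the imSitu metadata linked above, not something this card guarantees; inspect `dataset['train'].features` and adjust the field name if the schema differs.

```python
from collections import Counter

from datasets import load_dataset

dataset = load_dataset("mikewang/imsitu")
# 'verb' is assumed from the imSitu metadata; verify against dataset['train'].features
verb_counts = Counter(example["verb"] for example in dataset["train"])
print(verb_counts.most_common(10))
```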
**Paper Citation:**
```
@inproceedings{yatskar2016,
title={Situation Recognition: Visual Semantic Role Labeling for Image Understanding},
author={Yatskar, Mark and Zettlemoyer, Luke and Farhadi, Ali},
booktitle={Conference on Computer Vision and Pattern Recognition},
year={2016}
}
```
## Dataset Summary
imSitu is a dataset supporting situation recognition, the problem of producing a concise summary of the situation an image depicts, including: (1) the main activity, (2) the participating actors, objects, substances, and locations, and, most importantly, (3) the roles these participants play in the activity. The role set used by imSitu is derived from the linguistic resource FrameNet, and the entities are derived from ImageNet. The data in imSitu can be used to create robust algorithms for situation recognition.
|
mikewang/imsitu
|
[
"language:en",
"region:us"
] |
2023-08-18T02:58:33+00:00
|
{"language": ["en"], "pretty_name": "imSitu dataset"}
|
2023-08-18T03:01:56+00:00
|
[] |
[
"en"
] |
TAGS
#language-English #region-us
|
# Dataset Card for imSitu
## Dataset Description
Homepage: URL
Repository: URL
- The metadata used for imSitu: URL
- The images can be downloaded following: URL
- This HF dataset loads the 'URL', 'URL' and 'URL' from the repository
IMPORTANT NOTE: The 'frames' field in the loaded HF dataset contains a list of JSON strings (since the data structure differs from one verb frame to another). To convert the JSON strings back to dicts, you can refer to the following example:
Paper Citation:
## Dataset Summary
imSitu is a dataset supporting situation recognition, the problem of producing a concise summary of the situation an image depicts, including: (1) the main activity, (2) the participating actors, objects, substances, and locations, and, most importantly, (3) the roles these participants play in the activity. The role set used by imSitu is derived from the linguistic resource FrameNet, and the entities are derived from ImageNet. The data in imSitu can be used to create robust algorithms for situation recognition.
|
[
"# Dataset Card for imSitu",
"## Dataset Description\n\nHomepage: URL\n\nRepository: URL\n- The metadata used for imSitu: URL\n- The images can be downloaded following: URL\n- This HF dataset loads the 'URL', 'URL' and 'URL' from the repository\n\nIMPORTANT NOTE: The 'frames' field in the loaded HF dataset contains a list of json strings (since the data structure for each verb frame is different). To convert the json strings back to dicts, you can refer to the following example:\n\n\nPaper Citation:",
"## Dataset Summary\nimSitu is a dataset supporting situation recognition, the problem of producing a concise summary of the situation an image depicts including: (1) the main activity, (2) the participating actors, objects, substances, and locations and most importantly (3) the roles these participants play in the activity. The role set used by imSitu is derived from the linguistic resource FrameNet and the entities are derived from ImageNet. The data in imSitu can be used to create robust algorithms for situation recongntion."
] |
[
"TAGS\n#language-English #region-us \n",
"# Dataset Card for imSitu",
"## Dataset Description\n\nHomepage: URL\n\nRepository: URL\n- The metadata used for imSitu: URL\n- The images can be downloaded following: URL\n- This HF dataset loads the 'URL', 'URL' and 'URL' from the repository\n\nIMPORTANT NOTE: The 'frames' field in the loaded HF dataset contains a list of json strings (since the data structure for each verb frame is different). To convert the json strings back to dicts, you can refer to the following example:\n\n\nPaper Citation:",
"## Dataset Summary\nimSitu is a dataset supporting situation recognition, the problem of producing a concise summary of the situation an image depicts including: (1) the main activity, (2) the participating actors, objects, substances, and locations and most importantly (3) the roles these participants play in the activity. The role set used by imSitu is derived from the linguistic resource FrameNet and the entities are derived from ImageNet. The data in imSitu can be used to create robust algorithms for situation recongntion."
] |
[
10,
8,
123,
124
] |
[
"passage: TAGS\n#language-English #region-us \n# Dataset Card for imSitu## Dataset Description\n\nHomepage: URL\n\nRepository: URL\n- The metadata used for imSitu: URL\n- The images can be downloaded following: URL\n- This HF dataset loads the 'URL', 'URL' and 'URL' from the repository\n\nIMPORTANT NOTE: The 'frames' field in the loaded HF dataset contains a list of json strings (since the data structure for each verb frame is different). To convert the json strings back to dicts, you can refer to the following example:\n\n\nPaper Citation:## Dataset Summary\nimSitu is a dataset supporting situation recognition, the problem of producing a concise summary of the situation an image depicts including: (1) the main activity, (2) the participating actors, objects, substances, and locations and most importantly (3) the roles these participants play in the activity. The role set used by imSitu is derived from the linguistic resource FrameNet and the entities are derived from ImageNet. The data in imSitu can be used to create robust algorithms for situation recongntion."
] |
ae840da4851e6d8196f35483495428d6bce8aa73
|
# Dataset Card for "text2sql-spider-processed"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
sartmis1/text2sql-spider
|
[
"region:us"
] |
2023-08-18T03:00:10+00:00
|
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "question", "dtype": "string"}, {"name": "query", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1319601, "num_examples": 7000}], "download_size": 339745, "dataset_size": 1319601}}
|
2023-08-18T03:01:41+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "text2sql-spider-processed"
More Information needed
|
[
"# Dataset Card for \"text2sql-spider-processed\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"text2sql-spider-processed\"\n\nMore Information needed"
] |
[
6,
20
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"text2sql-spider-processed\"\n\nMore Information needed"
] |
a029b1e5a33af559e5a46dfc2be1d38c848c0ff9
|
# Dataset Card for "PKDD_GPTNEO_Finetuned"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
EgilKarlsen/PKDD_GPTNEO_Finetuned
|
[
"region:us"
] |
2023-08-18T03:10:56+00:00
|
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "0", "dtype": "float32"}, {"name": "1", "dtype": "float32"}, {"name": "2", "dtype": "float32"}, {"name": "3", "dtype": "float32"}, {"name": "4", "dtype": "float32"}, {"name": "5", "dtype": "float32"}, {"name": "6", "dtype": "float32"}, {"name": "7", "dtype": "float32"}, {"name": "8", "dtype": "float32"}, {"name": "9", "dtype": "float32"}, {"name": "10", "dtype": "float32"}, {"name": "11", "dtype": "float32"}, {"name": "12", "dtype": "float32"}, {"name": "13", "dtype": "float32"}, {"name": "14", "dtype": "float32"}, {"name": "15", "dtype": "float32"}, {"name": "16", "dtype": "float32"}, {"name": "17", "dtype": "float32"}, {"name": "18", "dtype": "float32"}, {"name": "19", "dtype": "float32"}, {"name": "20", "dtype": "float32"}, {"name": "21", "dtype": "float32"}, {"name": "22", "dtype": "float32"}, {"name": "23", "dtype": "float32"}, {"name": "24", "dtype": "float32"}, {"name": "25", "dtype": "float32"}, {"name": "26", "dtype": "float32"}, {"name": "27", "dtype": "float32"}, {"name": "28", "dtype": "float32"}, {"name": "29", "dtype": "float32"}, {"name": "30", "dtype": "float32"}, {"name": "31", "dtype": "float32"}, {"name": "32", "dtype": "float32"}, {"name": "33", "dtype": "float32"}, {"name": "34", "dtype": "float32"}, {"name": "35", "dtype": "float32"}, {"name": "36", "dtype": "float32"}, {"name": "37", "dtype": "float32"}, {"name": "38", "dtype": "float32"}, {"name": "39", "dtype": "float32"}, {"name": "40", "dtype": "float32"}, {"name": "41", "dtype": "float32"}, {"name": "42", "dtype": "float32"}, {"name": "43", "dtype": "float32"}, {"name": "44", "dtype": "float32"}, {"name": "45", "dtype": "float32"}, {"name": "46", "dtype": "float32"}, {"name": "47", "dtype": "float32"}, {"name": "48", "dtype": "float32"}, {"name": "49", "dtype": "float32"}, {"name": "50", "dtype": "float32"}, {"name": "51", "dtype": "float32"}, {"name": "52", "dtype": "float32"}, {"name": "53", "dtype": "float32"}, {"name": "54", "dtype": "float32"}, {"name": "55", "dtype": "float32"}, {"name": "56", "dtype": "float32"}, {"name": "57", "dtype": "float32"}, {"name": "58", "dtype": "float32"}, {"name": "59", "dtype": "float32"}, {"name": "60", "dtype": "float32"}, {"name": "61", "dtype": "float32"}, {"name": "62", "dtype": "float32"}, {"name": "63", "dtype": "float32"}, {"name": "64", "dtype": "float32"}, {"name": "65", "dtype": "float32"}, {"name": "66", "dtype": "float32"}, {"name": "67", "dtype": "float32"}, {"name": "68", "dtype": "float32"}, {"name": "69", "dtype": "float32"}, {"name": "70", "dtype": "float32"}, {"name": "71", "dtype": "float32"}, {"name": "72", "dtype": "float32"}, {"name": "73", "dtype": "float32"}, {"name": "74", "dtype": "float32"}, {"name": "75", "dtype": "float32"}, {"name": "76", "dtype": "float32"}, {"name": "77", "dtype": "float32"}, {"name": "78", "dtype": "float32"}, {"name": "79", "dtype": "float32"}, {"name": "80", "dtype": "float32"}, {"name": "81", "dtype": "float32"}, {"name": "82", "dtype": "float32"}, {"name": "83", "dtype": "float32"}, {"name": "84", "dtype": "float32"}, {"name": "85", "dtype": "float32"}, {"name": "86", "dtype": "float32"}, {"name": "87", "dtype": "float32"}, {"name": "88", "dtype": "float32"}, {"name": "89", "dtype": "float32"}, {"name": "90", "dtype": "float32"}, {"name": "91", "dtype": "float32"}, {"name": "92", "dtype": "float32"}, {"name": "93", "dtype": "float32"}, 
{"name": "94", "dtype": "float32"}, {"name": "95", "dtype": "float32"}, {"name": "96", "dtype": "float32"}, {"name": "97", "dtype": "float32"}, {"name": "98", "dtype": "float32"}, {"name": "99", "dtype": "float32"}, {"name": "100", "dtype": "float32"}, {"name": "101", "dtype": "float32"}, {"name": "102", "dtype": "float32"}, {"name": "103", "dtype": "float32"}, {"name": "104", "dtype": "float32"}, {"name": "105", "dtype": "float32"}, {"name": "106", "dtype": "float32"}, {"name": "107", "dtype": "float32"}, {"name": "108", "dtype": "float32"}, {"name": "109", "dtype": "float32"}, {"name": "110", "dtype": "float32"}, {"name": "111", "dtype": "float32"}, {"name": "112", "dtype": "float32"}, {"name": "113", "dtype": "float32"}, {"name": "114", "dtype": "float32"}, {"name": "115", "dtype": "float32"}, {"name": "116", "dtype": "float32"}, {"name": "117", "dtype": "float32"}, {"name": "118", "dtype": "float32"}, {"name": "119", "dtype": "float32"}, {"name": "120", "dtype": "float32"}, {"name": "121", "dtype": "float32"}, {"name": "122", "dtype": "float32"}, {"name": "123", "dtype": "float32"}, {"name": "124", "dtype": "float32"}, {"name": "125", "dtype": "float32"}, {"name": "126", "dtype": "float32"}, {"name": "127", "dtype": "float32"}, {"name": "128", "dtype": "float32"}, {"name": "129", "dtype": "float32"}, {"name": "130", "dtype": "float32"}, {"name": "131", "dtype": "float32"}, {"name": "132", "dtype": "float32"}, {"name": "133", "dtype": "float32"}, {"name": "134", "dtype": "float32"}, {"name": "135", "dtype": "float32"}, {"name": "136", "dtype": "float32"}, {"name": "137", "dtype": "float32"}, {"name": "138", "dtype": "float32"}, {"name": "139", "dtype": "float32"}, {"name": "140", "dtype": "float32"}, {"name": "141", "dtype": "float32"}, {"name": "142", "dtype": "float32"}, {"name": "143", "dtype": "float32"}, {"name": "144", "dtype": "float32"}, {"name": "145", "dtype": "float32"}, {"name": "146", "dtype": "float32"}, {"name": "147", "dtype": "float32"}, {"name": "148", "dtype": "float32"}, {"name": "149", "dtype": "float32"}, {"name": "150", "dtype": "float32"}, {"name": "151", "dtype": "float32"}, {"name": "152", "dtype": "float32"}, {"name": "153", "dtype": "float32"}, {"name": "154", "dtype": "float32"}, {"name": "155", "dtype": "float32"}, {"name": "156", "dtype": "float32"}, {"name": "157", "dtype": "float32"}, {"name": "158", "dtype": "float32"}, {"name": "159", "dtype": "float32"}, {"name": "160", "dtype": "float32"}, {"name": "161", "dtype": "float32"}, {"name": "162", "dtype": "float32"}, {"name": "163", "dtype": "float32"}, {"name": "164", "dtype": "float32"}, {"name": "165", "dtype": "float32"}, {"name": "166", "dtype": "float32"}, {"name": "167", "dtype": "float32"}, {"name": "168", "dtype": "float32"}, {"name": "169", "dtype": "float32"}, {"name": "170", "dtype": "float32"}, {"name": "171", "dtype": "float32"}, {"name": "172", "dtype": "float32"}, {"name": "173", "dtype": "float32"}, {"name": "174", "dtype": "float32"}, {"name": "175", "dtype": "float32"}, {"name": "176", "dtype": "float32"}, {"name": "177", "dtype": "float32"}, {"name": "178", "dtype": "float32"}, {"name": "179", "dtype": "float32"}, {"name": "180", "dtype": "float32"}, {"name": "181", "dtype": "float32"}, {"name": "182", "dtype": "float32"}, {"name": "183", "dtype": "float32"}, {"name": "184", "dtype": "float32"}, {"name": "185", "dtype": "float32"}, {"name": "186", "dtype": "float32"}, {"name": "187", "dtype": "float32"}, {"name": "188", "dtype": "float32"}, {"name": "189", "dtype": "float32"}, {"name": 
"190", "dtype": "float32"}, {"name": "191", "dtype": "float32"}, {"name": "192", "dtype": "float32"}, {"name": "193", "dtype": "float32"}, {"name": "194", "dtype": "float32"}, {"name": "195", "dtype": "float32"}, {"name": "196", "dtype": "float32"}, {"name": "197", "dtype": "float32"}, {"name": "198", "dtype": "float32"}, {"name": "199", "dtype": "float32"}, {"name": "200", "dtype": "float32"}, {"name": "201", "dtype": "float32"}, {"name": "202", "dtype": "float32"}, {"name": "203", "dtype": "float32"}, {"name": "204", "dtype": "float32"}, {"name": "205", "dtype": "float32"}, {"name": "206", "dtype": "float32"}, {"name": "207", "dtype": "float32"}, {"name": "208", "dtype": "float32"}, {"name": "209", "dtype": "float32"}, {"name": "210", "dtype": "float32"}, {"name": "211", "dtype": "float32"}, {"name": "212", "dtype": "float32"}, {"name": "213", "dtype": "float32"}, {"name": "214", "dtype": "float32"}, {"name": "215", "dtype": "float32"}, {"name": "216", "dtype": "float32"}, {"name": "217", "dtype": "float32"}, {"name": "218", "dtype": "float32"}, {"name": "219", "dtype": "float32"}, {"name": "220", "dtype": "float32"}, {"name": "221", "dtype": "float32"}, {"name": "222", "dtype": "float32"}, {"name": "223", "dtype": "float32"}, {"name": "224", "dtype": "float32"}, {"name": "225", "dtype": "float32"}, {"name": "226", "dtype": "float32"}, {"name": "227", "dtype": "float32"}, {"name": "228", "dtype": "float32"}, {"name": "229", "dtype": "float32"}, {"name": "230", "dtype": "float32"}, {"name": "231", "dtype": "float32"}, {"name": "232", "dtype": "float32"}, {"name": "233", "dtype": "float32"}, {"name": "234", "dtype": "float32"}, {"name": "235", "dtype": "float32"}, {"name": "236", "dtype": "float32"}, {"name": "237", "dtype": "float32"}, {"name": "238", "dtype": "float32"}, {"name": "239", "dtype": "float32"}, {"name": "240", "dtype": "float32"}, {"name": "241", "dtype": "float32"}, {"name": "242", "dtype": "float32"}, {"name": "243", "dtype": "float32"}, {"name": "244", "dtype": "float32"}, {"name": "245", "dtype": "float32"}, {"name": "246", "dtype": "float32"}, {"name": "247", "dtype": "float32"}, {"name": "248", "dtype": "float32"}, {"name": "249", "dtype": "float32"}, {"name": "250", "dtype": "float32"}, {"name": "251", "dtype": "float32"}, {"name": "252", "dtype": "float32"}, {"name": "253", "dtype": "float32"}, {"name": "254", "dtype": "float32"}, {"name": "255", "dtype": "float32"}, {"name": "256", "dtype": "float32"}, {"name": "257", "dtype": "float32"}, {"name": "258", "dtype": "float32"}, {"name": "259", "dtype": "float32"}, {"name": "260", "dtype": "float32"}, {"name": "261", "dtype": "float32"}, {"name": "262", "dtype": "float32"}, {"name": "263", "dtype": "float32"}, {"name": "264", "dtype": "float32"}, {"name": "265", "dtype": "float32"}, {"name": "266", "dtype": "float32"}, {"name": "267", "dtype": "float32"}, {"name": "268", "dtype": "float32"}, {"name": "269", "dtype": "float32"}, {"name": "270", "dtype": "float32"}, {"name": "271", "dtype": "float32"}, {"name": "272", "dtype": "float32"}, {"name": "273", "dtype": "float32"}, {"name": "274", "dtype": "float32"}, {"name": "275", "dtype": "float32"}, {"name": "276", "dtype": "float32"}, {"name": "277", "dtype": "float32"}, {"name": "278", "dtype": "float32"}, {"name": "279", "dtype": "float32"}, {"name": "280", "dtype": "float32"}, {"name": "281", "dtype": "float32"}, {"name": "282", "dtype": "float32"}, {"name": "283", "dtype": "float32"}, {"name": "284", "dtype": "float32"}, {"name": "285", "dtype": "float32"}, {"name": 
"286", "dtype": "float32"}, {"name": "287", "dtype": "float32"}, {"name": "288", "dtype": "float32"}, {"name": "289", "dtype": "float32"}, {"name": "290", "dtype": "float32"}, {"name": "291", "dtype": "float32"}, {"name": "292", "dtype": "float32"}, {"name": "293", "dtype": "float32"}, {"name": "294", "dtype": "float32"}, {"name": "295", "dtype": "float32"}, {"name": "296", "dtype": "float32"}, {"name": "297", "dtype": "float32"}, {"name": "298", "dtype": "float32"}, {"name": "299", "dtype": "float32"}, {"name": "300", "dtype": "float32"}, {"name": "301", "dtype": "float32"}, {"name": "302", "dtype": "float32"}, {"name": "303", "dtype": "float32"}, {"name": "304", "dtype": "float32"}, {"name": "305", "dtype": "float32"}, {"name": "306", "dtype": "float32"}, {"name": "307", "dtype": "float32"}, {"name": "308", "dtype": "float32"}, {"name": "309", "dtype": "float32"}, {"name": "310", "dtype": "float32"}, {"name": "311", "dtype": "float32"}, {"name": "312", "dtype": "float32"}, {"name": "313", "dtype": "float32"}, {"name": "314", "dtype": "float32"}, {"name": "315", "dtype": "float32"}, {"name": "316", "dtype": "float32"}, {"name": "317", "dtype": "float32"}, {"name": "318", "dtype": "float32"}, {"name": "319", "dtype": "float32"}, {"name": "320", "dtype": "float32"}, {"name": "321", "dtype": "float32"}, {"name": "322", "dtype": "float32"}, {"name": "323", "dtype": "float32"}, {"name": "324", "dtype": "float32"}, {"name": "325", "dtype": "float32"}, {"name": "326", "dtype": "float32"}, {"name": "327", "dtype": "float32"}, {"name": "328", "dtype": "float32"}, {"name": "329", "dtype": "float32"}, {"name": "330", "dtype": "float32"}, {"name": "331", "dtype": "float32"}, {"name": "332", "dtype": "float32"}, {"name": "333", "dtype": "float32"}, {"name": "334", "dtype": "float32"}, {"name": "335", "dtype": "float32"}, {"name": "336", "dtype": "float32"}, {"name": "337", "dtype": "float32"}, {"name": "338", "dtype": "float32"}, {"name": "339", "dtype": "float32"}, {"name": "340", "dtype": "float32"}, {"name": "341", "dtype": "float32"}, {"name": "342", "dtype": "float32"}, {"name": "343", "dtype": "float32"}, {"name": "344", "dtype": "float32"}, {"name": "345", "dtype": "float32"}, {"name": "346", "dtype": "float32"}, {"name": "347", "dtype": "float32"}, {"name": "348", "dtype": "float32"}, {"name": "349", "dtype": "float32"}, {"name": "350", "dtype": "float32"}, {"name": "351", "dtype": "float32"}, {"name": "352", "dtype": "float32"}, {"name": "353", "dtype": "float32"}, {"name": "354", "dtype": "float32"}, {"name": "355", "dtype": "float32"}, {"name": "356", "dtype": "float32"}, {"name": "357", "dtype": "float32"}, {"name": "358", "dtype": "float32"}, {"name": "359", "dtype": "float32"}, {"name": "360", "dtype": "float32"}, {"name": "361", "dtype": "float32"}, {"name": "362", "dtype": "float32"}, {"name": "363", "dtype": "float32"}, {"name": "364", "dtype": "float32"}, {"name": "365", "dtype": "float32"}, {"name": "366", "dtype": "float32"}, {"name": "367", "dtype": "float32"}, {"name": "368", "dtype": "float32"}, {"name": "369", "dtype": "float32"}, {"name": "370", "dtype": "float32"}, {"name": "371", "dtype": "float32"}, {"name": "372", "dtype": "float32"}, {"name": "373", "dtype": "float32"}, {"name": "374", "dtype": "float32"}, {"name": "375", "dtype": "float32"}, {"name": "376", "dtype": "float32"}, {"name": "377", "dtype": "float32"}, {"name": "378", "dtype": "float32"}, {"name": "379", "dtype": "float32"}, {"name": "380", "dtype": "float32"}, {"name": "381", "dtype": "float32"}, {"name": 
"382", "dtype": "float32"}, {"name": "383", "dtype": "float32"}, {"name": "384", "dtype": "float32"}, {"name": "385", "dtype": "float32"}, {"name": "386", "dtype": "float32"}, {"name": "387", "dtype": "float32"}, {"name": "388", "dtype": "float32"}, {"name": "389", "dtype": "float32"}, {"name": "390", "dtype": "float32"}, {"name": "391", "dtype": "float32"}, {"name": "392", "dtype": "float32"}, {"name": "393", "dtype": "float32"}, {"name": "394", "dtype": "float32"}, {"name": "395", "dtype": "float32"}, {"name": "396", "dtype": "float32"}, {"name": "397", "dtype": "float32"}, {"name": "398", "dtype": "float32"}, {"name": "399", "dtype": "float32"}, {"name": "400", "dtype": "float32"}, {"name": "401", "dtype": "float32"}, {"name": "402", "dtype": "float32"}, {"name": "403", "dtype": "float32"}, {"name": "404", "dtype": "float32"}, {"name": "405", "dtype": "float32"}, {"name": "406", "dtype": "float32"}, {"name": "407", "dtype": "float32"}, {"name": "408", "dtype": "float32"}, {"name": "409", "dtype": "float32"}, {"name": "410", "dtype": "float32"}, {"name": "411", "dtype": "float32"}, {"name": "412", "dtype": "float32"}, {"name": "413", "dtype": "float32"}, {"name": "414", "dtype": "float32"}, {"name": "415", "dtype": "float32"}, {"name": "416", "dtype": "float32"}, {"name": "417", "dtype": "float32"}, {"name": "418", "dtype": "float32"}, {"name": "419", "dtype": "float32"}, {"name": "420", "dtype": "float32"}, {"name": "421", "dtype": "float32"}, {"name": "422", "dtype": "float32"}, {"name": "423", "dtype": "float32"}, {"name": "424", "dtype": "float32"}, {"name": "425", "dtype": "float32"}, {"name": "426", "dtype": "float32"}, {"name": "427", "dtype": "float32"}, {"name": "428", "dtype": "float32"}, {"name": "429", "dtype": "float32"}, {"name": "430", "dtype": "float32"}, {"name": "431", "dtype": "float32"}, {"name": "432", "dtype": "float32"}, {"name": "433", "dtype": "float32"}, {"name": "434", "dtype": "float32"}, {"name": "435", "dtype": "float32"}, {"name": "436", "dtype": "float32"}, {"name": "437", "dtype": "float32"}, {"name": "438", "dtype": "float32"}, {"name": "439", "dtype": "float32"}, {"name": "440", "dtype": "float32"}, {"name": "441", "dtype": "float32"}, {"name": "442", "dtype": "float32"}, {"name": "443", "dtype": "float32"}, {"name": "444", "dtype": "float32"}, {"name": "445", "dtype": "float32"}, {"name": "446", "dtype": "float32"}, {"name": "447", "dtype": "float32"}, {"name": "448", "dtype": "float32"}, {"name": "449", "dtype": "float32"}, {"name": "450", "dtype": "float32"}, {"name": "451", "dtype": "float32"}, {"name": "452", "dtype": "float32"}, {"name": "453", "dtype": "float32"}, {"name": "454", "dtype": "float32"}, {"name": "455", "dtype": "float32"}, {"name": "456", "dtype": "float32"}, {"name": "457", "dtype": "float32"}, {"name": "458", "dtype": "float32"}, {"name": "459", "dtype": "float32"}, {"name": "460", "dtype": "float32"}, {"name": "461", "dtype": "float32"}, {"name": "462", "dtype": "float32"}, {"name": "463", "dtype": "float32"}, {"name": "464", "dtype": "float32"}, {"name": "465", "dtype": "float32"}, {"name": "466", "dtype": "float32"}, {"name": "467", "dtype": "float32"}, {"name": "468", "dtype": "float32"}, {"name": "469", "dtype": "float32"}, {"name": "470", "dtype": "float32"}, {"name": "471", "dtype": "float32"}, {"name": "472", "dtype": "float32"}, {"name": "473", "dtype": "float32"}, {"name": "474", "dtype": "float32"}, {"name": "475", "dtype": "float32"}, {"name": "476", "dtype": "float32"}, {"name": "477", "dtype": "float32"}, {"name": 
"478", "dtype": "float32"}, {"name": "479", "dtype": "float32"}, {"name": "480", "dtype": "float32"}, {"name": "481", "dtype": "float32"}, {"name": "482", "dtype": "float32"}, {"name": "483", "dtype": "float32"}, {"name": "484", "dtype": "float32"}, {"name": "485", "dtype": "float32"}, {"name": "486", "dtype": "float32"}, {"name": "487", "dtype": "float32"}, {"name": "488", "dtype": "float32"}, {"name": "489", "dtype": "float32"}, {"name": "490", "dtype": "float32"}, {"name": "491", "dtype": "float32"}, {"name": "492", "dtype": "float32"}, {"name": "493", "dtype": "float32"}, {"name": "494", "dtype": "float32"}, {"name": "495", "dtype": "float32"}, {"name": "496", "dtype": "float32"}, {"name": "497", "dtype": "float32"}, {"name": "498", "dtype": "float32"}, {"name": "499", "dtype": "float32"}, {"name": "500", "dtype": "float32"}, {"name": "501", "dtype": "float32"}, {"name": "502", "dtype": "float32"}, {"name": "503", "dtype": "float32"}, {"name": "504", "dtype": "float32"}, {"name": "505", "dtype": "float32"}, {"name": "506", "dtype": "float32"}, {"name": "507", "dtype": "float32"}, {"name": "508", "dtype": "float32"}, {"name": "509", "dtype": "float32"}, {"name": "510", "dtype": "float32"}, {"name": "511", "dtype": "float32"}, {"name": "512", "dtype": "float32"}, {"name": "513", "dtype": "float32"}, {"name": "514", "dtype": "float32"}, {"name": "515", "dtype": "float32"}, {"name": "516", "dtype": "float32"}, {"name": "517", "dtype": "float32"}, {"name": "518", "dtype": "float32"}, {"name": "519", "dtype": "float32"}, {"name": "520", "dtype": "float32"}, {"name": "521", "dtype": "float32"}, {"name": "522", "dtype": "float32"}, {"name": "523", "dtype": "float32"}, {"name": "524", "dtype": "float32"}, {"name": "525", "dtype": "float32"}, {"name": "526", "dtype": "float32"}, {"name": "527", "dtype": "float32"}, {"name": "528", "dtype": "float32"}, {"name": "529", "dtype": "float32"}, {"name": "530", "dtype": "float32"}, {"name": "531", "dtype": "float32"}, {"name": "532", "dtype": "float32"}, {"name": "533", "dtype": "float32"}, {"name": "534", "dtype": "float32"}, {"name": "535", "dtype": "float32"}, {"name": "536", "dtype": "float32"}, {"name": "537", "dtype": "float32"}, {"name": "538", "dtype": "float32"}, {"name": "539", "dtype": "float32"}, {"name": "540", "dtype": "float32"}, {"name": "541", "dtype": "float32"}, {"name": "542", "dtype": "float32"}, {"name": "543", "dtype": "float32"}, {"name": "544", "dtype": "float32"}, {"name": "545", "dtype": "float32"}, {"name": "546", "dtype": "float32"}, {"name": "547", "dtype": "float32"}, {"name": "548", "dtype": "float32"}, {"name": "549", "dtype": "float32"}, {"name": "550", "dtype": "float32"}, {"name": "551", "dtype": "float32"}, {"name": "552", "dtype": "float32"}, {"name": "553", "dtype": "float32"}, {"name": "554", "dtype": "float32"}, {"name": "555", "dtype": "float32"}, {"name": "556", "dtype": "float32"}, {"name": "557", "dtype": "float32"}, {"name": "558", "dtype": "float32"}, {"name": "559", "dtype": "float32"}, {"name": "560", "dtype": "float32"}, {"name": "561", "dtype": "float32"}, {"name": "562", "dtype": "float32"}, {"name": "563", "dtype": "float32"}, {"name": "564", "dtype": "float32"}, {"name": "565", "dtype": "float32"}, {"name": "566", "dtype": "float32"}, {"name": "567", "dtype": "float32"}, {"name": "568", "dtype": "float32"}, {"name": "569", "dtype": "float32"}, {"name": "570", "dtype": "float32"}, {"name": "571", "dtype": "float32"}, {"name": "572", "dtype": "float32"}, {"name": "573", "dtype": "float32"}, {"name": 
"574", "dtype": "float32"}, {"name": "575", "dtype": "float32"}, {"name": "576", "dtype": "float32"}, {"name": "577", "dtype": "float32"}, {"name": "578", "dtype": "float32"}, {"name": "579", "dtype": "float32"}, {"name": "580", "dtype": "float32"}, {"name": "581", "dtype": "float32"}, {"name": "582", "dtype": "float32"}, {"name": "583", "dtype": "float32"}, {"name": "584", "dtype": "float32"}, {"name": "585", "dtype": "float32"}, {"name": "586", "dtype": "float32"}, {"name": "587", "dtype": "float32"}, {"name": "588", "dtype": "float32"}, {"name": "589", "dtype": "float32"}, {"name": "590", "dtype": "float32"}, {"name": "591", "dtype": "float32"}, {"name": "592", "dtype": "float32"}, {"name": "593", "dtype": "float32"}, {"name": "594", "dtype": "float32"}, {"name": "595", "dtype": "float32"}, {"name": "596", "dtype": "float32"}, {"name": "597", "dtype": "float32"}, {"name": "598", "dtype": "float32"}, {"name": "599", "dtype": "float32"}, {"name": "600", "dtype": "float32"}, {"name": "601", "dtype": "float32"}, {"name": "602", "dtype": "float32"}, {"name": "603", "dtype": "float32"}, {"name": "604", "dtype": "float32"}, {"name": "605", "dtype": "float32"}, {"name": "606", "dtype": "float32"}, {"name": "607", "dtype": "float32"}, {"name": "608", "dtype": "float32"}, {"name": "609", "dtype": "float32"}, {"name": "610", "dtype": "float32"}, {"name": "611", "dtype": "float32"}, {"name": "612", "dtype": "float32"}, {"name": "613", "dtype": "float32"}, {"name": "614", "dtype": "float32"}, {"name": "615", "dtype": "float32"}, {"name": "616", "dtype": "float32"}, {"name": "617", "dtype": "float32"}, {"name": "618", "dtype": "float32"}, {"name": "619", "dtype": "float32"}, {"name": "620", "dtype": "float32"}, {"name": "621", "dtype": "float32"}, {"name": "622", "dtype": "float32"}, {"name": "623", "dtype": "float32"}, {"name": "624", "dtype": "float32"}, {"name": "625", "dtype": "float32"}, {"name": "626", "dtype": "float32"}, {"name": "627", "dtype": "float32"}, {"name": "628", "dtype": "float32"}, {"name": "629", "dtype": "float32"}, {"name": "630", "dtype": "float32"}, {"name": "631", "dtype": "float32"}, {"name": "632", "dtype": "float32"}, {"name": "633", "dtype": "float32"}, {"name": "634", "dtype": "float32"}, {"name": "635", "dtype": "float32"}, {"name": "636", "dtype": "float32"}, {"name": "637", "dtype": "float32"}, {"name": "638", "dtype": "float32"}, {"name": "639", "dtype": "float32"}, {"name": "640", "dtype": "float32"}, {"name": "641", "dtype": "float32"}, {"name": "642", "dtype": "float32"}, {"name": "643", "dtype": "float32"}, {"name": "644", "dtype": "float32"}, {"name": "645", "dtype": "float32"}, {"name": "646", "dtype": "float32"}, {"name": "647", "dtype": "float32"}, {"name": "648", "dtype": "float32"}, {"name": "649", "dtype": "float32"}, {"name": "650", "dtype": "float32"}, {"name": "651", "dtype": "float32"}, {"name": "652", "dtype": "float32"}, {"name": "653", "dtype": "float32"}, {"name": "654", "dtype": "float32"}, {"name": "655", "dtype": "float32"}, {"name": "656", "dtype": "float32"}, {"name": "657", "dtype": "float32"}, {"name": "658", "dtype": "float32"}, {"name": "659", "dtype": "float32"}, {"name": "660", "dtype": "float32"}, {"name": "661", "dtype": "float32"}, {"name": "662", "dtype": "float32"}, {"name": "663", "dtype": "float32"}, {"name": "664", "dtype": "float32"}, {"name": "665", "dtype": "float32"}, {"name": "666", "dtype": "float32"}, {"name": "667", "dtype": "float32"}, {"name": "668", "dtype": "float32"}, {"name": "669", "dtype": "float32"}, {"name": 
"670", "dtype": "float32"}, {"name": "671", "dtype": "float32"}, {"name": "672", "dtype": "float32"}, {"name": "673", "dtype": "float32"}, {"name": "674", "dtype": "float32"}, {"name": "675", "dtype": "float32"}, {"name": "676", "dtype": "float32"}, {"name": "677", "dtype": "float32"}, {"name": "678", "dtype": "float32"}, {"name": "679", "dtype": "float32"}, {"name": "680", "dtype": "float32"}, {"name": "681", "dtype": "float32"}, {"name": "682", "dtype": "float32"}, {"name": "683", "dtype": "float32"}, {"name": "684", "dtype": "float32"}, {"name": "685", "dtype": "float32"}, {"name": "686", "dtype": "float32"}, {"name": "687", "dtype": "float32"}, {"name": "688", "dtype": "float32"}, {"name": "689", "dtype": "float32"}, {"name": "690", "dtype": "float32"}, {"name": "691", "dtype": "float32"}, {"name": "692", "dtype": "float32"}, {"name": "693", "dtype": "float32"}, {"name": "694", "dtype": "float32"}, {"name": "695", "dtype": "float32"}, {"name": "696", "dtype": "float32"}, {"name": "697", "dtype": "float32"}, {"name": "698", "dtype": "float32"}, {"name": "699", "dtype": "float32"}, {"name": "700", "dtype": "float32"}, {"name": "701", "dtype": "float32"}, {"name": "702", "dtype": "float32"}, {"name": "703", "dtype": "float32"}, {"name": "704", "dtype": "float32"}, {"name": "705", "dtype": "float32"}, {"name": "706", "dtype": "float32"}, {"name": "707", "dtype": "float32"}, {"name": "708", "dtype": "float32"}, {"name": "709", "dtype": "float32"}, {"name": "710", "dtype": "float32"}, {"name": "711", "dtype": "float32"}, {"name": "712", "dtype": "float32"}, {"name": "713", "dtype": "float32"}, {"name": "714", "dtype": "float32"}, {"name": "715", "dtype": "float32"}, {"name": "716", "dtype": "float32"}, {"name": "717", "dtype": "float32"}, {"name": "718", "dtype": "float32"}, {"name": "719", "dtype": "float32"}, {"name": "720", "dtype": "float32"}, {"name": "721", "dtype": "float32"}, {"name": "722", "dtype": "float32"}, {"name": "723", "dtype": "float32"}, {"name": "724", "dtype": "float32"}, {"name": "725", "dtype": "float32"}, {"name": "726", "dtype": "float32"}, {"name": "727", "dtype": "float32"}, {"name": "728", "dtype": "float32"}, {"name": "729", "dtype": "float32"}, {"name": "730", "dtype": "float32"}, {"name": "731", "dtype": "float32"}, {"name": "732", "dtype": "float32"}, {"name": "733", "dtype": "float32"}, {"name": "734", "dtype": "float32"}, {"name": "735", "dtype": "float32"}, {"name": "736", "dtype": "float32"}, {"name": "737", "dtype": "float32"}, {"name": "738", "dtype": "float32"}, {"name": "739", "dtype": "float32"}, {"name": "740", "dtype": "float32"}, {"name": "741", "dtype": "float32"}, {"name": "742", "dtype": "float32"}, {"name": "743", "dtype": "float32"}, {"name": "744", "dtype": "float32"}, {"name": "745", "dtype": "float32"}, {"name": "746", "dtype": "float32"}, {"name": "747", "dtype": "float32"}, {"name": "748", "dtype": "float32"}, {"name": "749", "dtype": "float32"}, {"name": "750", "dtype": "float32"}, {"name": "751", "dtype": "float32"}, {"name": "752", "dtype": "float32"}, {"name": "753", "dtype": "float32"}, {"name": "754", "dtype": "float32"}, {"name": "755", "dtype": "float32"}, {"name": "756", "dtype": "float32"}, {"name": "757", "dtype": "float32"}, {"name": "758", "dtype": "float32"}, {"name": "759", "dtype": "float32"}, {"name": "760", "dtype": "float32"}, {"name": "761", "dtype": "float32"}, {"name": "762", "dtype": "float32"}, {"name": "763", "dtype": "float32"}, {"name": "764", "dtype": "float32"}, {"name": "765", "dtype": "float32"}, {"name": 
"766", "dtype": "float32"}, {"name": "767", "dtype": "float32"}, {"name": "768", "dtype": "float32"}, {"name": "769", "dtype": "float32"}, {"name": "770", "dtype": "float32"}, {"name": "771", "dtype": "float32"}, {"name": "772", "dtype": "float32"}, {"name": "773", "dtype": "float32"}, {"name": "774", "dtype": "float32"}, {"name": "775", "dtype": "float32"}, {"name": "776", "dtype": "float32"}, {"name": "777", "dtype": "float32"}, {"name": "778", "dtype": "float32"}, {"name": "779", "dtype": "float32"}, {"name": "780", "dtype": "float32"}, {"name": "781", "dtype": "float32"}, {"name": "782", "dtype": "float32"}, {"name": "783", "dtype": "float32"}, {"name": "784", "dtype": "float32"}, {"name": "785", "dtype": "float32"}, {"name": "786", "dtype": "float32"}, {"name": "787", "dtype": "float32"}, {"name": "788", "dtype": "float32"}, {"name": "789", "dtype": "float32"}, {"name": "790", "dtype": "float32"}, {"name": "791", "dtype": "float32"}, {"name": "792", "dtype": "float32"}, {"name": "793", "dtype": "float32"}, {"name": "794", "dtype": "float32"}, {"name": "795", "dtype": "float32"}, {"name": "796", "dtype": "float32"}, {"name": "797", "dtype": "float32"}, {"name": "798", "dtype": "float32"}, {"name": "799", "dtype": "float32"}, {"name": "800", "dtype": "float32"}, {"name": "801", "dtype": "float32"}, {"name": "802", "dtype": "float32"}, {"name": "803", "dtype": "float32"}, {"name": "804", "dtype": "float32"}, {"name": "805", "dtype": "float32"}, {"name": "806", "dtype": "float32"}, {"name": "807", "dtype": "float32"}, {"name": "808", "dtype": "float32"}, {"name": "809", "dtype": "float32"}, {"name": "810", "dtype": "float32"}, {"name": "811", "dtype": "float32"}, {"name": "812", "dtype": "float32"}, {"name": "813", "dtype": "float32"}, {"name": "814", "dtype": "float32"}, {"name": "815", "dtype": "float32"}, {"name": "816", "dtype": "float32"}, {"name": "817", "dtype": "float32"}, {"name": "818", "dtype": "float32"}, {"name": "819", "dtype": "float32"}, {"name": "820", "dtype": "float32"}, {"name": "821", "dtype": "float32"}, {"name": "822", "dtype": "float32"}, {"name": "823", "dtype": "float32"}, {"name": "824", "dtype": "float32"}, {"name": "825", "dtype": "float32"}, {"name": "826", "dtype": "float32"}, {"name": "827", "dtype": "float32"}, {"name": "828", "dtype": "float32"}, {"name": "829", "dtype": "float32"}, {"name": "830", "dtype": "float32"}, {"name": "831", "dtype": "float32"}, {"name": "832", "dtype": "float32"}, {"name": "833", "dtype": "float32"}, {"name": "834", "dtype": "float32"}, {"name": "835", "dtype": "float32"}, {"name": "836", "dtype": "float32"}, {"name": "837", "dtype": "float32"}, {"name": "838", "dtype": "float32"}, {"name": "839", "dtype": "float32"}, {"name": "840", "dtype": "float32"}, {"name": "841", "dtype": "float32"}, {"name": "842", "dtype": "float32"}, {"name": "843", "dtype": "float32"}, {"name": "844", "dtype": "float32"}, {"name": "845", "dtype": "float32"}, {"name": "846", "dtype": "float32"}, {"name": "847", "dtype": "float32"}, {"name": "848", "dtype": "float32"}, {"name": "849", "dtype": "float32"}, {"name": "850", "dtype": "float32"}, {"name": "851", "dtype": "float32"}, {"name": "852", "dtype": "float32"}, {"name": "853", "dtype": "float32"}, {"name": "854", "dtype": "float32"}, {"name": "855", "dtype": "float32"}, {"name": "856", "dtype": "float32"}, {"name": "857", "dtype": "float32"}, {"name": "858", "dtype": "float32"}, {"name": "859", "dtype": "float32"}, {"name": "860", "dtype": "float32"}, {"name": "861", "dtype": "float32"}, {"name": 
"862", "dtype": "float32"}, {"name": "863", "dtype": "float32"}, {"name": "864", "dtype": "float32"}, {"name": "865", "dtype": "float32"}, {"name": "866", "dtype": "float32"}, {"name": "867", "dtype": "float32"}, {"name": "868", "dtype": "float32"}, {"name": "869", "dtype": "float32"}, {"name": "870", "dtype": "float32"}, {"name": "871", "dtype": "float32"}, {"name": "872", "dtype": "float32"}, {"name": "873", "dtype": "float32"}, {"name": "874", "dtype": "float32"}, {"name": "875", "dtype": "float32"}, {"name": "876", "dtype": "float32"}, {"name": "877", "dtype": "float32"}, {"name": "878", "dtype": "float32"}, {"name": "879", "dtype": "float32"}, {"name": "880", "dtype": "float32"}, {"name": "881", "dtype": "float32"}, {"name": "882", "dtype": "float32"}, {"name": "883", "dtype": "float32"}, {"name": "884", "dtype": "float32"}, {"name": "885", "dtype": "float32"}, {"name": "886", "dtype": "float32"}, {"name": "887", "dtype": "float32"}, {"name": "888", "dtype": "float32"}, {"name": "889", "dtype": "float32"}, {"name": "890", "dtype": "float32"}, {"name": "891", "dtype": "float32"}, {"name": "892", "dtype": "float32"}, {"name": "893", "dtype": "float32"}, {"name": "894", "dtype": "float32"}, {"name": "895", "dtype": "float32"}, {"name": "896", "dtype": "float32"}, {"name": "897", "dtype": "float32"}, {"name": "898", "dtype": "float32"}, {"name": "899", "dtype": "float32"}, {"name": "900", "dtype": "float32"}, {"name": "901", "dtype": "float32"}, {"name": "902", "dtype": "float32"}, {"name": "903", "dtype": "float32"}, {"name": "904", "dtype": "float32"}, {"name": "905", "dtype": "float32"}, {"name": "906", "dtype": "float32"}, {"name": "907", "dtype": "float32"}, {"name": "908", "dtype": "float32"}, {"name": "909", "dtype": "float32"}, {"name": "910", "dtype": "float32"}, {"name": "911", "dtype": "float32"}, {"name": "912", "dtype": "float32"}, {"name": "913", "dtype": "float32"}, {"name": "914", "dtype": "float32"}, {"name": "915", "dtype": "float32"}, {"name": "916", "dtype": "float32"}, {"name": "917", "dtype": "float32"}, {"name": "918", "dtype": "float32"}, {"name": "919", "dtype": "float32"}, {"name": "920", "dtype": "float32"}, {"name": "921", "dtype": "float32"}, {"name": "922", "dtype": "float32"}, {"name": "923", "dtype": "float32"}, {"name": "924", "dtype": "float32"}, {"name": "925", "dtype": "float32"}, {"name": "926", "dtype": "float32"}, {"name": "927", "dtype": "float32"}, {"name": "928", "dtype": "float32"}, {"name": "929", "dtype": "float32"}, {"name": "930", "dtype": "float32"}, {"name": "931", "dtype": "float32"}, {"name": "932", "dtype": "float32"}, {"name": "933", "dtype": "float32"}, {"name": "934", "dtype": "float32"}, {"name": "935", "dtype": "float32"}, {"name": "936", "dtype": "float32"}, {"name": "937", "dtype": "float32"}, {"name": "938", "dtype": "float32"}, {"name": "939", "dtype": "float32"}, {"name": "940", "dtype": "float32"}, {"name": "941", "dtype": "float32"}, {"name": "942", "dtype": "float32"}, {"name": "943", "dtype": "float32"}, {"name": "944", "dtype": "float32"}, {"name": "945", "dtype": "float32"}, {"name": "946", "dtype": "float32"}, {"name": "947", "dtype": "float32"}, {"name": "948", "dtype": "float32"}, {"name": "949", "dtype": "float32"}, {"name": "950", "dtype": "float32"}, {"name": "951", "dtype": "float32"}, {"name": "952", "dtype": "float32"}, {"name": "953", "dtype": "float32"}, {"name": "954", "dtype": "float32"}, {"name": "955", "dtype": "float32"}, {"name": "956", "dtype": "float32"}, {"name": "957", "dtype": "float32"}, {"name": 
"958", "dtype": "float32"}, {"name": "959", "dtype": "float32"}, {"name": "960", "dtype": "float32"}, {"name": "961", "dtype": "float32"}, {"name": "962", "dtype": "float32"}, {"name": "963", "dtype": "float32"}, {"name": "964", "dtype": "float32"}, {"name": "965", "dtype": "float32"}, {"name": "966", "dtype": "float32"}, {"name": "967", "dtype": "float32"}, {"name": "968", "dtype": "float32"}, {"name": "969", "dtype": "float32"}, {"name": "970", "dtype": "float32"}, {"name": "971", "dtype": "float32"}, {"name": "972", "dtype": "float32"}, {"name": "973", "dtype": "float32"}, {"name": "974", "dtype": "float32"}, {"name": "975", "dtype": "float32"}, {"name": "976", "dtype": "float32"}, {"name": "977", "dtype": "float32"}, {"name": "978", "dtype": "float32"}, {"name": "979", "dtype": "float32"}, {"name": "980", "dtype": "float32"}, {"name": "981", "dtype": "float32"}, {"name": "982", "dtype": "float32"}, {"name": "983", "dtype": "float32"}, {"name": "984", "dtype": "float32"}, {"name": "985", "dtype": "float32"}, {"name": "986", "dtype": "float32"}, {"name": "987", "dtype": "float32"}, {"name": "988", "dtype": "float32"}, {"name": "989", "dtype": "float32"}, {"name": "990", "dtype": "float32"}, {"name": "991", "dtype": "float32"}, {"name": "992", "dtype": "float32"}, {"name": "993", "dtype": "float32"}, {"name": "994", "dtype": "float32"}, {"name": "995", "dtype": "float32"}, {"name": "996", "dtype": "float32"}, {"name": "997", "dtype": "float32"}, {"name": "998", "dtype": "float32"}, {"name": "999", "dtype": "float32"}, {"name": "1000", "dtype": "float32"}, {"name": "1001", "dtype": "float32"}, {"name": "1002", "dtype": "float32"}, {"name": "1003", "dtype": "float32"}, {"name": "1004", "dtype": "float32"}, {"name": "1005", "dtype": "float32"}, {"name": "1006", "dtype": "float32"}, {"name": "1007", "dtype": "float32"}, {"name": "1008", "dtype": "float32"}, {"name": "1009", "dtype": "float32"}, {"name": "1010", "dtype": "float32"}, {"name": "1011", "dtype": "float32"}, {"name": "1012", "dtype": "float32"}, {"name": "1013", "dtype": "float32"}, {"name": "1014", "dtype": "float32"}, {"name": "1015", "dtype": "float32"}, {"name": "1016", "dtype": "float32"}, {"name": "1017", "dtype": "float32"}, {"name": "1018", "dtype": "float32"}, {"name": "1019", "dtype": "float32"}, {"name": "1020", "dtype": "float32"}, {"name": "1021", "dtype": "float32"}, {"name": "1022", "dtype": "float32"}, {"name": "1023", "dtype": "float32"}, {"name": "1024", "dtype": "float32"}, {"name": "1025", "dtype": "float32"}, {"name": "1026", "dtype": "float32"}, {"name": "1027", "dtype": "float32"}, {"name": "1028", "dtype": "float32"}, {"name": "1029", "dtype": "float32"}, {"name": "1030", "dtype": "float32"}, {"name": "1031", "dtype": "float32"}, {"name": "1032", "dtype": "float32"}, {"name": "1033", "dtype": "float32"}, {"name": "1034", "dtype": "float32"}, {"name": "1035", "dtype": "float32"}, {"name": "1036", "dtype": "float32"}, {"name": "1037", "dtype": "float32"}, {"name": "1038", "dtype": "float32"}, {"name": "1039", "dtype": "float32"}, {"name": "1040", "dtype": "float32"}, {"name": "1041", "dtype": "float32"}, {"name": "1042", "dtype": "float32"}, {"name": "1043", "dtype": "float32"}, {"name": "1044", "dtype": "float32"}, {"name": "1045", "dtype": "float32"}, {"name": "1046", "dtype": "float32"}, {"name": "1047", "dtype": "float32"}, {"name": "1048", "dtype": "float32"}, {"name": "1049", "dtype": "float32"}, {"name": "1050", "dtype": "float32"}, {"name": "1051", "dtype": "float32"}, {"name": "1052", "dtype": 
"float32"}, {"name": "1053", "dtype": "float32"}, {"name": "1054", "dtype": "float32"}, {"name": "1055", "dtype": "float32"}, {"name": "1056", "dtype": "float32"}, {"name": "1057", "dtype": "float32"}, {"name": "1058", "dtype": "float32"}, {"name": "1059", "dtype": "float32"}, {"name": "1060", "dtype": "float32"}, {"name": "1061", "dtype": "float32"}, {"name": "1062", "dtype": "float32"}, {"name": "1063", "dtype": "float32"}, {"name": "1064", "dtype": "float32"}, {"name": "1065", "dtype": "float32"}, {"name": "1066", "dtype": "float32"}, {"name": "1067", "dtype": "float32"}, {"name": "1068", "dtype": "float32"}, {"name": "1069", "dtype": "float32"}, {"name": "1070", "dtype": "float32"}, {"name": "1071", "dtype": "float32"}, {"name": "1072", "dtype": "float32"}, {"name": "1073", "dtype": "float32"}, {"name": "1074", "dtype": "float32"}, {"name": "1075", "dtype": "float32"}, {"name": "1076", "dtype": "float32"}, {"name": "1077", "dtype": "float32"}, {"name": "1078", "dtype": "float32"}, {"name": "1079", "dtype": "float32"}, {"name": "1080", "dtype": "float32"}, {"name": "1081", "dtype": "float32"}, {"name": "1082", "dtype": "float32"}, {"name": "1083", "dtype": "float32"}, {"name": "1084", "dtype": "float32"}, {"name": "1085", "dtype": "float32"}, {"name": "1086", "dtype": "float32"}, {"name": "1087", "dtype": "float32"}, {"name": "1088", "dtype": "float32"}, {"name": "1089", "dtype": "float32"}, {"name": "1090", "dtype": "float32"}, {"name": "1091", "dtype": "float32"}, {"name": "1092", "dtype": "float32"}, {"name": "1093", "dtype": "float32"}, {"name": "1094", "dtype": "float32"}, {"name": "1095", "dtype": "float32"}, {"name": "1096", "dtype": "float32"}, {"name": "1097", "dtype": "float32"}, {"name": "1098", "dtype": "float32"}, {"name": "1099", "dtype": "float32"}, {"name": "1100", "dtype": "float32"}, {"name": "1101", "dtype": "float32"}, {"name": "1102", "dtype": "float32"}, {"name": "1103", "dtype": "float32"}, {"name": "1104", "dtype": "float32"}, {"name": "1105", "dtype": "float32"}, {"name": "1106", "dtype": "float32"}, {"name": "1107", "dtype": "float32"}, {"name": "1108", "dtype": "float32"}, {"name": "1109", "dtype": "float32"}, {"name": "1110", "dtype": "float32"}, {"name": "1111", "dtype": "float32"}, {"name": "1112", "dtype": "float32"}, {"name": "1113", "dtype": "float32"}, {"name": "1114", "dtype": "float32"}, {"name": "1115", "dtype": "float32"}, {"name": "1116", "dtype": "float32"}, {"name": "1117", "dtype": "float32"}, {"name": "1118", "dtype": "float32"}, {"name": "1119", "dtype": "float32"}, {"name": "1120", "dtype": "float32"}, {"name": "1121", "dtype": "float32"}, {"name": "1122", "dtype": "float32"}, {"name": "1123", "dtype": "float32"}, {"name": "1124", "dtype": "float32"}, {"name": "1125", "dtype": "float32"}, {"name": "1126", "dtype": "float32"}, {"name": "1127", "dtype": "float32"}, {"name": "1128", "dtype": "float32"}, {"name": "1129", "dtype": "float32"}, {"name": "1130", "dtype": "float32"}, {"name": "1131", "dtype": "float32"}, {"name": "1132", "dtype": "float32"}, {"name": "1133", "dtype": "float32"}, {"name": "1134", "dtype": "float32"}, {"name": "1135", "dtype": "float32"}, {"name": "1136", "dtype": "float32"}, {"name": "1137", "dtype": "float32"}, {"name": "1138", "dtype": "float32"}, {"name": "1139", "dtype": "float32"}, {"name": "1140", "dtype": "float32"}, {"name": "1141", "dtype": "float32"}, {"name": "1142", "dtype": "float32"}, {"name": "1143", "dtype": "float32"}, {"name": "1144", "dtype": "float32"}, {"name": "1145", "dtype": "float32"}, {"name": 
"1146", "dtype": "float32"}, {"name": "1147", "dtype": "float32"}, {"name": "1148", "dtype": "float32"}, {"name": "1149", "dtype": "float32"}, {"name": "1150", "dtype": "float32"}, {"name": "1151", "dtype": "float32"}, {"name": "1152", "dtype": "float32"}, {"name": "1153", "dtype": "float32"}, {"name": "1154", "dtype": "float32"}, {"name": "1155", "dtype": "float32"}, {"name": "1156", "dtype": "float32"}, {"name": "1157", "dtype": "float32"}, {"name": "1158", "dtype": "float32"}, {"name": "1159", "dtype": "float32"}, {"name": "1160", "dtype": "float32"}, {"name": "1161", "dtype": "float32"}, {"name": "1162", "dtype": "float32"}, {"name": "1163", "dtype": "float32"}, {"name": "1164", "dtype": "float32"}, {"name": "1165", "dtype": "float32"}, {"name": "1166", "dtype": "float32"}, {"name": "1167", "dtype": "float32"}, {"name": "1168", "dtype": "float32"}, {"name": "1169", "dtype": "float32"}, {"name": "1170", "dtype": "float32"}, {"name": "1171", "dtype": "float32"}, {"name": "1172", "dtype": "float32"}, {"name": "1173", "dtype": "float32"}, {"name": "1174", "dtype": "float32"}, {"name": "1175", "dtype": "float32"}, {"name": "1176", "dtype": "float32"}, {"name": "1177", "dtype": "float32"}, {"name": "1178", "dtype": "float32"}, {"name": "1179", "dtype": "float32"}, {"name": "1180", "dtype": "float32"}, {"name": "1181", "dtype": "float32"}, {"name": "1182", "dtype": "float32"}, {"name": "1183", "dtype": "float32"}, {"name": "1184", "dtype": "float32"}, {"name": "1185", "dtype": "float32"}, {"name": "1186", "dtype": "float32"}, {"name": "1187", "dtype": "float32"}, {"name": "1188", "dtype": "float32"}, {"name": "1189", "dtype": "float32"}, {"name": "1190", "dtype": "float32"}, {"name": "1191", "dtype": "float32"}, {"name": "1192", "dtype": "float32"}, {"name": "1193", "dtype": "float32"}, {"name": "1194", "dtype": "float32"}, {"name": "1195", "dtype": "float32"}, {"name": "1196", "dtype": "float32"}, {"name": "1197", "dtype": "float32"}, {"name": "1198", "dtype": "float32"}, {"name": "1199", "dtype": "float32"}, {"name": "1200", "dtype": "float32"}, {"name": "1201", "dtype": "float32"}, {"name": "1202", "dtype": "float32"}, {"name": "1203", "dtype": "float32"}, {"name": "1204", "dtype": "float32"}, {"name": "1205", "dtype": "float32"}, {"name": "1206", "dtype": "float32"}, {"name": "1207", "dtype": "float32"}, {"name": "1208", "dtype": "float32"}, {"name": "1209", "dtype": "float32"}, {"name": "1210", "dtype": "float32"}, {"name": "1211", "dtype": "float32"}, {"name": "1212", "dtype": "float32"}, {"name": "1213", "dtype": "float32"}, {"name": "1214", "dtype": "float32"}, {"name": "1215", "dtype": "float32"}, {"name": "1216", "dtype": "float32"}, {"name": "1217", "dtype": "float32"}, {"name": "1218", "dtype": "float32"}, {"name": "1219", "dtype": "float32"}, {"name": "1220", "dtype": "float32"}, {"name": "1221", "dtype": "float32"}, {"name": "1222", "dtype": "float32"}, {"name": "1223", "dtype": "float32"}, {"name": "1224", "dtype": "float32"}, {"name": "1225", "dtype": "float32"}, {"name": "1226", "dtype": "float32"}, {"name": "1227", "dtype": "float32"}, {"name": "1228", "dtype": "float32"}, {"name": "1229", "dtype": "float32"}, {"name": "1230", "dtype": "float32"}, {"name": "1231", "dtype": "float32"}, {"name": "1232", "dtype": "float32"}, {"name": "1233", "dtype": "float32"}, {"name": "1234", "dtype": "float32"}, {"name": "1235", "dtype": "float32"}, {"name": "1236", "dtype": "float32"}, {"name": "1237", "dtype": "float32"}, {"name": "1238", "dtype": "float32"}, {"name": "1239", "dtype": 
"float32"}, {"name": "1240", "dtype": "float32"}, {"name": "1241", "dtype": "float32"}, {"name": "1242", "dtype": "float32"}, {"name": "1243", "dtype": "float32"}, {"name": "1244", "dtype": "float32"}, {"name": "1245", "dtype": "float32"}, {"name": "1246", "dtype": "float32"}, {"name": "1247", "dtype": "float32"}, {"name": "1248", "dtype": "float32"}, {"name": "1249", "dtype": "float32"}, {"name": "1250", "dtype": "float32"}, {"name": "1251", "dtype": "float32"}, {"name": "1252", "dtype": "float32"}, {"name": "1253", "dtype": "float32"}, {"name": "1254", "dtype": "float32"}, {"name": "1255", "dtype": "float32"}, {"name": "1256", "dtype": "float32"}, {"name": "1257", "dtype": "float32"}, {"name": "1258", "dtype": "float32"}, {"name": "1259", "dtype": "float32"}, {"name": "1260", "dtype": "float32"}, {"name": "1261", "dtype": "float32"}, {"name": "1262", "dtype": "float32"}, {"name": "1263", "dtype": "float32"}, {"name": "1264", "dtype": "float32"}, {"name": "1265", "dtype": "float32"}, {"name": "1266", "dtype": "float32"}, {"name": "1267", "dtype": "float32"}, {"name": "1268", "dtype": "float32"}, {"name": "1269", "dtype": "float32"}, {"name": "1270", "dtype": "float32"}, {"name": "1271", "dtype": "float32"}, {"name": "1272", "dtype": "float32"}, {"name": "1273", "dtype": "float32"}, {"name": "1274", "dtype": "float32"}, {"name": "1275", "dtype": "float32"}, {"name": "1276", "dtype": "float32"}, {"name": "1277", "dtype": "float32"}, {"name": "1278", "dtype": "float32"}, {"name": "1279", "dtype": "float32"}, {"name": "1280", "dtype": "float32"}, {"name": "1281", "dtype": "float32"}, {"name": "1282", "dtype": "float32"}, {"name": "1283", "dtype": "float32"}, {"name": "1284", "dtype": "float32"}, {"name": "1285", "dtype": "float32"}, {"name": "1286", "dtype": "float32"}, {"name": "1287", "dtype": "float32"}, {"name": "1288", "dtype": "float32"}, {"name": "1289", "dtype": "float32"}, {"name": "1290", "dtype": "float32"}, {"name": "1291", "dtype": "float32"}, {"name": "1292", "dtype": "float32"}, {"name": "1293", "dtype": "float32"}, {"name": "1294", "dtype": "float32"}, {"name": "1295", "dtype": "float32"}, {"name": "1296", "dtype": "float32"}, {"name": "1297", "dtype": "float32"}, {"name": "1298", "dtype": "float32"}, {"name": "1299", "dtype": "float32"}, {"name": "1300", "dtype": "float32"}, {"name": "1301", "dtype": "float32"}, {"name": "1302", "dtype": "float32"}, {"name": "1303", "dtype": "float32"}, {"name": "1304", "dtype": "float32"}, {"name": "1305", "dtype": "float32"}, {"name": "1306", "dtype": "float32"}, {"name": "1307", "dtype": "float32"}, {"name": "1308", "dtype": "float32"}, {"name": "1309", "dtype": "float32"}, {"name": "1310", "dtype": "float32"}, {"name": "1311", "dtype": "float32"}, {"name": "1312", "dtype": "float32"}, {"name": "1313", "dtype": "float32"}, {"name": "1314", "dtype": "float32"}, {"name": "1315", "dtype": "float32"}, {"name": "1316", "dtype": "float32"}, {"name": "1317", "dtype": "float32"}, {"name": "1318", "dtype": "float32"}, {"name": "1319", "dtype": "float32"}, {"name": "1320", "dtype": "float32"}, {"name": "1321", "dtype": "float32"}, {"name": "1322", "dtype": "float32"}, {"name": "1323", "dtype": "float32"}, {"name": "1324", "dtype": "float32"}, {"name": "1325", "dtype": "float32"}, {"name": "1326", "dtype": "float32"}, {"name": "1327", "dtype": "float32"}, {"name": "1328", "dtype": "float32"}, {"name": "1329", "dtype": "float32"}, {"name": "1330", "dtype": "float32"}, {"name": "1331", "dtype": "float32"}, {"name": "1332", "dtype": "float32"}, {"name": 
"1333", "dtype": "float32"}, {"name": "1334", "dtype": "float32"}, {"name": "1335", "dtype": "float32"}, {"name": "1336", "dtype": "float32"}, {"name": "1337", "dtype": "float32"}, {"name": "1338", "dtype": "float32"}, {"name": "1339", "dtype": "float32"}, {"name": "1340", "dtype": "float32"}, {"name": "1341", "dtype": "float32"}, {"name": "1342", "dtype": "float32"}, {"name": "1343", "dtype": "float32"}, {"name": "1344", "dtype": "float32"}, {"name": "1345", "dtype": "float32"}, {"name": "1346", "dtype": "float32"}, {"name": "1347", "dtype": "float32"}, {"name": "1348", "dtype": "float32"}, {"name": "1349", "dtype": "float32"}, {"name": "1350", "dtype": "float32"}, {"name": "1351", "dtype": "float32"}, {"name": "1352", "dtype": "float32"}, {"name": "1353", "dtype": "float32"}, {"name": "1354", "dtype": "float32"}, {"name": "1355", "dtype": "float32"}, {"name": "1356", "dtype": "float32"}, {"name": "1357", "dtype": "float32"}, {"name": "1358", "dtype": "float32"}, {"name": "1359", "dtype": "float32"}, {"name": "1360", "dtype": "float32"}, {"name": "1361", "dtype": "float32"}, {"name": "1362", "dtype": "float32"}, {"name": "1363", "dtype": "float32"}, {"name": "1364", "dtype": "float32"}, {"name": "1365", "dtype": "float32"}, {"name": "1366", "dtype": "float32"}, {"name": "1367", "dtype": "float32"}, {"name": "1368", "dtype": "float32"}, {"name": "1369", "dtype": "float32"}, {"name": "1370", "dtype": "float32"}, {"name": "1371", "dtype": "float32"}, {"name": "1372", "dtype": "float32"}, {"name": "1373", "dtype": "float32"}, {"name": "1374", "dtype": "float32"}, {"name": "1375", "dtype": "float32"}, {"name": "1376", "dtype": "float32"}, {"name": "1377", "dtype": "float32"}, {"name": "1378", "dtype": "float32"}, {"name": "1379", "dtype": "float32"}, {"name": "1380", "dtype": "float32"}, {"name": "1381", "dtype": "float32"}, {"name": "1382", "dtype": "float32"}, {"name": "1383", "dtype": "float32"}, {"name": "1384", "dtype": "float32"}, {"name": "1385", "dtype": "float32"}, {"name": "1386", "dtype": "float32"}, {"name": "1387", "dtype": "float32"}, {"name": "1388", "dtype": "float32"}, {"name": "1389", "dtype": "float32"}, {"name": "1390", "dtype": "float32"}, {"name": "1391", "dtype": "float32"}, {"name": "1392", "dtype": "float32"}, {"name": "1393", "dtype": "float32"}, {"name": "1394", "dtype": "float32"}, {"name": "1395", "dtype": "float32"}, {"name": "1396", "dtype": "float32"}, {"name": "1397", "dtype": "float32"}, {"name": "1398", "dtype": "float32"}, {"name": "1399", "dtype": "float32"}, {"name": "1400", "dtype": "float32"}, {"name": "1401", "dtype": "float32"}, {"name": "1402", "dtype": "float32"}, {"name": "1403", "dtype": "float32"}, {"name": "1404", "dtype": "float32"}, {"name": "1405", "dtype": "float32"}, {"name": "1406", "dtype": "float32"}, {"name": "1407", "dtype": "float32"}, {"name": "1408", "dtype": "float32"}, {"name": "1409", "dtype": "float32"}, {"name": "1410", "dtype": "float32"}, {"name": "1411", "dtype": "float32"}, {"name": "1412", "dtype": "float32"}, {"name": "1413", "dtype": "float32"}, {"name": "1414", "dtype": "float32"}, {"name": "1415", "dtype": "float32"}, {"name": "1416", "dtype": "float32"}, {"name": "1417", "dtype": "float32"}, {"name": "1418", "dtype": "float32"}, {"name": "1419", "dtype": "float32"}, {"name": "1420", "dtype": "float32"}, {"name": "1421", "dtype": "float32"}, {"name": "1422", "dtype": "float32"}, {"name": "1423", "dtype": "float32"}, {"name": "1424", "dtype": "float32"}, {"name": "1425", "dtype": "float32"}, {"name": "1426", "dtype": 
"float32"}, {"name": "1427", "dtype": "float32"}, {"name": "1428", "dtype": "float32"}, {"name": "1429", "dtype": "float32"}, {"name": "1430", "dtype": "float32"}, {"name": "1431", "dtype": "float32"}, {"name": "1432", "dtype": "float32"}, {"name": "1433", "dtype": "float32"}, {"name": "1434", "dtype": "float32"}, {"name": "1435", "dtype": "float32"}, {"name": "1436", "dtype": "float32"}, {"name": "1437", "dtype": "float32"}, {"name": "1438", "dtype": "float32"}, {"name": "1439", "dtype": "float32"}, {"name": "1440", "dtype": "float32"}, {"name": "1441", "dtype": "float32"}, {"name": "1442", "dtype": "float32"}, {"name": "1443", "dtype": "float32"}, {"name": "1444", "dtype": "float32"}, {"name": "1445", "dtype": "float32"}, {"name": "1446", "dtype": "float32"}, {"name": "1447", "dtype": "float32"}, {"name": "1448", "dtype": "float32"}, {"name": "1449", "dtype": "float32"}, {"name": "1450", "dtype": "float32"}, {"name": "1451", "dtype": "float32"}, {"name": "1452", "dtype": "float32"}, {"name": "1453", "dtype": "float32"}, {"name": "1454", "dtype": "float32"}, {"name": "1455", "dtype": "float32"}, {"name": "1456", "dtype": "float32"}, {"name": "1457", "dtype": "float32"}, {"name": "1458", "dtype": "float32"}, {"name": "1459", "dtype": "float32"}, {"name": "1460", "dtype": "float32"}, {"name": "1461", "dtype": "float32"}, {"name": "1462", "dtype": "float32"}, {"name": "1463", "dtype": "float32"}, {"name": "1464", "dtype": "float32"}, {"name": "1465", "dtype": "float32"}, {"name": "1466", "dtype": "float32"}, {"name": "1467", "dtype": "float32"}, {"name": "1468", "dtype": "float32"}, {"name": "1469", "dtype": "float32"}, {"name": "1470", "dtype": "float32"}, {"name": "1471", "dtype": "float32"}, {"name": "1472", "dtype": "float32"}, {"name": "1473", "dtype": "float32"}, {"name": "1474", "dtype": "float32"}, {"name": "1475", "dtype": "float32"}, {"name": "1476", "dtype": "float32"}, {"name": "1477", "dtype": "float32"}, {"name": "1478", "dtype": "float32"}, {"name": "1479", "dtype": "float32"}, {"name": "1480", "dtype": "float32"}, {"name": "1481", "dtype": "float32"}, {"name": "1482", "dtype": "float32"}, {"name": "1483", "dtype": "float32"}, {"name": "1484", "dtype": "float32"}, {"name": "1485", "dtype": "float32"}, {"name": "1486", "dtype": "float32"}, {"name": "1487", "dtype": "float32"}, {"name": "1488", "dtype": "float32"}, {"name": "1489", "dtype": "float32"}, {"name": "1490", "dtype": "float32"}, {"name": "1491", "dtype": "float32"}, {"name": "1492", "dtype": "float32"}, {"name": "1493", "dtype": "float32"}, {"name": "1494", "dtype": "float32"}, {"name": "1495", "dtype": "float32"}, {"name": "1496", "dtype": "float32"}, {"name": "1497", "dtype": "float32"}, {"name": "1498", "dtype": "float32"}, {"name": "1499", "dtype": "float32"}, {"name": "1500", "dtype": "float32"}, {"name": "1501", "dtype": "float32"}, {"name": "1502", "dtype": "float32"}, {"name": "1503", "dtype": "float32"}, {"name": "1504", "dtype": "float32"}, {"name": "1505", "dtype": "float32"}, {"name": "1506", "dtype": "float32"}, {"name": "1507", "dtype": "float32"}, {"name": "1508", "dtype": "float32"}, {"name": "1509", "dtype": "float32"}, {"name": "1510", "dtype": "float32"}, {"name": "1511", "dtype": "float32"}, {"name": "1512", "dtype": "float32"}, {"name": "1513", "dtype": "float32"}, {"name": "1514", "dtype": "float32"}, {"name": "1515", "dtype": "float32"}, {"name": "1516", "dtype": "float32"}, {"name": "1517", "dtype": "float32"}, {"name": "1518", "dtype": "float32"}, {"name": "1519", "dtype": "float32"}, {"name": 
"1520", "dtype": "float32"}, {"name": "1521", "dtype": "float32"}, {"name": "1522", "dtype": "float32"}, {"name": "1523", "dtype": "float32"}, {"name": "1524", "dtype": "float32"}, {"name": "1525", "dtype": "float32"}, {"name": "1526", "dtype": "float32"}, {"name": "1527", "dtype": "float32"}, {"name": "1528", "dtype": "float32"}, {"name": "1529", "dtype": "float32"}, {"name": "1530", "dtype": "float32"}, {"name": "1531", "dtype": "float32"}, {"name": "1532", "dtype": "float32"}, {"name": "1533", "dtype": "float32"}, {"name": "1534", "dtype": "float32"}, {"name": "1535", "dtype": "float32"}, {"name": "1536", "dtype": "float32"}, {"name": "1537", "dtype": "float32"}, {"name": "1538", "dtype": "float32"}, {"name": "1539", "dtype": "float32"}, {"name": "1540", "dtype": "float32"}, {"name": "1541", "dtype": "float32"}, {"name": "1542", "dtype": "float32"}, {"name": "1543", "dtype": "float32"}, {"name": "1544", "dtype": "float32"}, {"name": "1545", "dtype": "float32"}, {"name": "1546", "dtype": "float32"}, {"name": "1547", "dtype": "float32"}, {"name": "1548", "dtype": "float32"}, {"name": "1549", "dtype": "float32"}, {"name": "1550", "dtype": "float32"}, {"name": "1551", "dtype": "float32"}, {"name": "1552", "dtype": "float32"}, {"name": "1553", "dtype": "float32"}, {"name": "1554", "dtype": "float32"}, {"name": "1555", "dtype": "float32"}, {"name": "1556", "dtype": "float32"}, {"name": "1557", "dtype": "float32"}, {"name": "1558", "dtype": "float32"}, {"name": "1559", "dtype": "float32"}, {"name": "1560", "dtype": "float32"}, {"name": "1561", "dtype": "float32"}, {"name": "1562", "dtype": "float32"}, {"name": "1563", "dtype": "float32"}, {"name": "1564", "dtype": "float32"}, {"name": "1565", "dtype": "float32"}, {"name": "1566", "dtype": "float32"}, {"name": "1567", "dtype": "float32"}, {"name": "1568", "dtype": "float32"}, {"name": "1569", "dtype": "float32"}, {"name": "1570", "dtype": "float32"}, {"name": "1571", "dtype": "float32"}, {"name": "1572", "dtype": "float32"}, {"name": "1573", "dtype": "float32"}, {"name": "1574", "dtype": "float32"}, {"name": "1575", "dtype": "float32"}, {"name": "1576", "dtype": "float32"}, {"name": "1577", "dtype": "float32"}, {"name": "1578", "dtype": "float32"}, {"name": "1579", "dtype": "float32"}, {"name": "1580", "dtype": "float32"}, {"name": "1581", "dtype": "float32"}, {"name": "1582", "dtype": "float32"}, {"name": "1583", "dtype": "float32"}, {"name": "1584", "dtype": "float32"}, {"name": "1585", "dtype": "float32"}, {"name": "1586", "dtype": "float32"}, {"name": "1587", "dtype": "float32"}, {"name": "1588", "dtype": "float32"}, {"name": "1589", "dtype": "float32"}, {"name": "1590", "dtype": "float32"}, {"name": "1591", "dtype": "float32"}, {"name": "1592", "dtype": "float32"}, {"name": "1593", "dtype": "float32"}, {"name": "1594", "dtype": "float32"}, {"name": "1595", "dtype": "float32"}, {"name": "1596", "dtype": "float32"}, {"name": "1597", "dtype": "float32"}, {"name": "1598", "dtype": "float32"}, {"name": "1599", "dtype": "float32"}, {"name": "1600", "dtype": "float32"}, {"name": "1601", "dtype": "float32"}, {"name": "1602", "dtype": "float32"}, {"name": "1603", "dtype": "float32"}, {"name": "1604", "dtype": "float32"}, {"name": "1605", "dtype": "float32"}, {"name": "1606", "dtype": "float32"}, {"name": "1607", "dtype": "float32"}, {"name": "1608", "dtype": "float32"}, {"name": "1609", "dtype": "float32"}, {"name": "1610", "dtype": "float32"}, {"name": "1611", "dtype": "float32"}, {"name": "1612", "dtype": "float32"}, {"name": "1613", "dtype": 
"float32"}, {"name": "1614", "dtype": "float32"}, {"name": "1615", "dtype": "float32"}, {"name": "1616", "dtype": "float32"}, {"name": "1617", "dtype": "float32"}, {"name": "1618", "dtype": "float32"}, {"name": "1619", "dtype": "float32"}, {"name": "1620", "dtype": "float32"}, {"name": "1621", "dtype": "float32"}, {"name": "1622", "dtype": "float32"}, {"name": "1623", "dtype": "float32"}, {"name": "1624", "dtype": "float32"}, {"name": "1625", "dtype": "float32"}, {"name": "1626", "dtype": "float32"}, {"name": "1627", "dtype": "float32"}, {"name": "1628", "dtype": "float32"}, {"name": "1629", "dtype": "float32"}, {"name": "1630", "dtype": "float32"}, {"name": "1631", "dtype": "float32"}, {"name": "1632", "dtype": "float32"}, {"name": "1633", "dtype": "float32"}, {"name": "1634", "dtype": "float32"}, {"name": "1635", "dtype": "float32"}, {"name": "1636", "dtype": "float32"}, {"name": "1637", "dtype": "float32"}, {"name": "1638", "dtype": "float32"}, {"name": "1639", "dtype": "float32"}, {"name": "1640", "dtype": "float32"}, {"name": "1641", "dtype": "float32"}, {"name": "1642", "dtype": "float32"}, {"name": "1643", "dtype": "float32"}, {"name": "1644", "dtype": "float32"}, {"name": "1645", "dtype": "float32"}, {"name": "1646", "dtype": "float32"}, {"name": "1647", "dtype": "float32"}, {"name": "1648", "dtype": "float32"}, {"name": "1649", "dtype": "float32"}, {"name": "1650", "dtype": "float32"}, {"name": "1651", "dtype": "float32"}, {"name": "1652", "dtype": "float32"}, {"name": "1653", "dtype": "float32"}, {"name": "1654", "dtype": "float32"}, {"name": "1655", "dtype": "float32"}, {"name": "1656", "dtype": "float32"}, {"name": "1657", "dtype": "float32"}, {"name": "1658", "dtype": "float32"}, {"name": "1659", "dtype": "float32"}, {"name": "1660", "dtype": "float32"}, {"name": "1661", "dtype": "float32"}, {"name": "1662", "dtype": "float32"}, {"name": "1663", "dtype": "float32"}, {"name": "1664", "dtype": "float32"}, {"name": "1665", "dtype": "float32"}, {"name": "1666", "dtype": "float32"}, {"name": "1667", "dtype": "float32"}, {"name": "1668", "dtype": "float32"}, {"name": "1669", "dtype": "float32"}, {"name": "1670", "dtype": "float32"}, {"name": "1671", "dtype": "float32"}, {"name": "1672", "dtype": "float32"}, {"name": "1673", "dtype": "float32"}, {"name": "1674", "dtype": "float32"}, {"name": "1675", "dtype": "float32"}, {"name": "1676", "dtype": "float32"}, {"name": "1677", "dtype": "float32"}, {"name": "1678", "dtype": "float32"}, {"name": "1679", "dtype": "float32"}, {"name": "1680", "dtype": "float32"}, {"name": "1681", "dtype": "float32"}, {"name": "1682", "dtype": "float32"}, {"name": "1683", "dtype": "float32"}, {"name": "1684", "dtype": "float32"}, {"name": "1685", "dtype": "float32"}, {"name": "1686", "dtype": "float32"}, {"name": "1687", "dtype": "float32"}, {"name": "1688", "dtype": "float32"}, {"name": "1689", "dtype": "float32"}, {"name": "1690", "dtype": "float32"}, {"name": "1691", "dtype": "float32"}, {"name": "1692", "dtype": "float32"}, {"name": "1693", "dtype": "float32"}, {"name": "1694", "dtype": "float32"}, {"name": "1695", "dtype": "float32"}, {"name": "1696", "dtype": "float32"}, {"name": "1697", "dtype": "float32"}, {"name": "1698", "dtype": "float32"}, {"name": "1699", "dtype": "float32"}, {"name": "1700", "dtype": "float32"}, {"name": "1701", "dtype": "float32"}, {"name": "1702", "dtype": "float32"}, {"name": "1703", "dtype": "float32"}, {"name": "1704", "dtype": "float32"}, {"name": "1705", "dtype": "float32"}, {"name": "1706", "dtype": "float32"}, {"name": 
"1707", "dtype": "float32"}, {"name": "1708", "dtype": "float32"}, {"name": "1709", "dtype": "float32"}, {"name": "1710", "dtype": "float32"}, {"name": "1711", "dtype": "float32"}, {"name": "1712", "dtype": "float32"}, {"name": "1713", "dtype": "float32"}, {"name": "1714", "dtype": "float32"}, {"name": "1715", "dtype": "float32"}, {"name": "1716", "dtype": "float32"}, {"name": "1717", "dtype": "float32"}, {"name": "1718", "dtype": "float32"}, {"name": "1719", "dtype": "float32"}, {"name": "1720", "dtype": "float32"}, {"name": "1721", "dtype": "float32"}, {"name": "1722", "dtype": "float32"}, {"name": "1723", "dtype": "float32"}, {"name": "1724", "dtype": "float32"}, {"name": "1725", "dtype": "float32"}, {"name": "1726", "dtype": "float32"}, {"name": "1727", "dtype": "float32"}, {"name": "1728", "dtype": "float32"}, {"name": "1729", "dtype": "float32"}, {"name": "1730", "dtype": "float32"}, {"name": "1731", "dtype": "float32"}, {"name": "1732", "dtype": "float32"}, {"name": "1733", "dtype": "float32"}, {"name": "1734", "dtype": "float32"}, {"name": "1735", "dtype": "float32"}, {"name": "1736", "dtype": "float32"}, {"name": "1737", "dtype": "float32"}, {"name": "1738", "dtype": "float32"}, {"name": "1739", "dtype": "float32"}, {"name": "1740", "dtype": "float32"}, {"name": "1741", "dtype": "float32"}, {"name": "1742", "dtype": "float32"}, {"name": "1743", "dtype": "float32"}, {"name": "1744", "dtype": "float32"}, {"name": "1745", "dtype": "float32"}, {"name": "1746", "dtype": "float32"}, {"name": "1747", "dtype": "float32"}, {"name": "1748", "dtype": "float32"}, {"name": "1749", "dtype": "float32"}, {"name": "1750", "dtype": "float32"}, {"name": "1751", "dtype": "float32"}, {"name": "1752", "dtype": "float32"}, {"name": "1753", "dtype": "float32"}, {"name": "1754", "dtype": "float32"}, {"name": "1755", "dtype": "float32"}, {"name": "1756", "dtype": "float32"}, {"name": "1757", "dtype": "float32"}, {"name": "1758", "dtype": "float32"}, {"name": "1759", "dtype": "float32"}, {"name": "1760", "dtype": "float32"}, {"name": "1761", "dtype": "float32"}, {"name": "1762", "dtype": "float32"}, {"name": "1763", "dtype": "float32"}, {"name": "1764", "dtype": "float32"}, {"name": "1765", "dtype": "float32"}, {"name": "1766", "dtype": "float32"}, {"name": "1767", "dtype": "float32"}, {"name": "1768", "dtype": "float32"}, {"name": "1769", "dtype": "float32"}, {"name": "1770", "dtype": "float32"}, {"name": "1771", "dtype": "float32"}, {"name": "1772", "dtype": "float32"}, {"name": "1773", "dtype": "float32"}, {"name": "1774", "dtype": "float32"}, {"name": "1775", "dtype": "float32"}, {"name": "1776", "dtype": "float32"}, {"name": "1777", "dtype": "float32"}, {"name": "1778", "dtype": "float32"}, {"name": "1779", "dtype": "float32"}, {"name": "1780", "dtype": "float32"}, {"name": "1781", "dtype": "float32"}, {"name": "1782", "dtype": "float32"}, {"name": "1783", "dtype": "float32"}, {"name": "1784", "dtype": "float32"}, {"name": "1785", "dtype": "float32"}, {"name": "1786", "dtype": "float32"}, {"name": "1787", "dtype": "float32"}, {"name": "1788", "dtype": "float32"}, {"name": "1789", "dtype": "float32"}, {"name": "1790", "dtype": "float32"}, {"name": "1791", "dtype": "float32"}, {"name": "1792", "dtype": "float32"}, {"name": "1793", "dtype": "float32"}, {"name": "1794", "dtype": "float32"}, {"name": "1795", "dtype": "float32"}, {"name": "1796", "dtype": "float32"}, {"name": "1797", "dtype": "float32"}, {"name": "1798", "dtype": "float32"}, {"name": "1799", "dtype": "float32"}, {"name": "1800", "dtype": 
"float32"}, {"name": "1801", "dtype": "float32"}, {"name": "1802", "dtype": "float32"}, {"name": "1803", "dtype": "float32"}, {"name": "1804", "dtype": "float32"}, {"name": "1805", "dtype": "float32"}, {"name": "1806", "dtype": "float32"}, {"name": "1807", "dtype": "float32"}, {"name": "1808", "dtype": "float32"}, {"name": "1809", "dtype": "float32"}, {"name": "1810", "dtype": "float32"}, {"name": "1811", "dtype": "float32"}, {"name": "1812", "dtype": "float32"}, {"name": "1813", "dtype": "float32"}, {"name": "1814", "dtype": "float32"}, {"name": "1815", "dtype": "float32"}, {"name": "1816", "dtype": "float32"}, {"name": "1817", "dtype": "float32"}, {"name": "1818", "dtype": "float32"}, {"name": "1819", "dtype": "float32"}, {"name": "1820", "dtype": "float32"}, {"name": "1821", "dtype": "float32"}, {"name": "1822", "dtype": "float32"}, {"name": "1823", "dtype": "float32"}, {"name": "1824", "dtype": "float32"}, {"name": "1825", "dtype": "float32"}, {"name": "1826", "dtype": "float32"}, {"name": "1827", "dtype": "float32"}, {"name": "1828", "dtype": "float32"}, {"name": "1829", "dtype": "float32"}, {"name": "1830", "dtype": "float32"}, {"name": "1831", "dtype": "float32"}, {"name": "1832", "dtype": "float32"}, {"name": "1833", "dtype": "float32"}, {"name": "1834", "dtype": "float32"}, {"name": "1835", "dtype": "float32"}, {"name": "1836", "dtype": "float32"}, {"name": "1837", "dtype": "float32"}, {"name": "1838", "dtype": "float32"}, {"name": "1839", "dtype": "float32"}, {"name": "1840", "dtype": "float32"}, {"name": "1841", "dtype": "float32"}, {"name": "1842", "dtype": "float32"}, {"name": "1843", "dtype": "float32"}, {"name": "1844", "dtype": "float32"}, {"name": "1845", "dtype": "float32"}, {"name": "1846", "dtype": "float32"}, {"name": "1847", "dtype": "float32"}, {"name": "1848", "dtype": "float32"}, {"name": "1849", "dtype": "float32"}, {"name": "1850", "dtype": "float32"}, {"name": "1851", "dtype": "float32"}, {"name": "1852", "dtype": "float32"}, {"name": "1853", "dtype": "float32"}, {"name": "1854", "dtype": "float32"}, {"name": "1855", "dtype": "float32"}, {"name": "1856", "dtype": "float32"}, {"name": "1857", "dtype": "float32"}, {"name": "1858", "dtype": "float32"}, {"name": "1859", "dtype": "float32"}, {"name": "1860", "dtype": "float32"}, {"name": "1861", "dtype": "float32"}, {"name": "1862", "dtype": "float32"}, {"name": "1863", "dtype": "float32"}, {"name": "1864", "dtype": "float32"}, {"name": "1865", "dtype": "float32"}, {"name": "1866", "dtype": "float32"}, {"name": "1867", "dtype": "float32"}, {"name": "1868", "dtype": "float32"}, {"name": "1869", "dtype": "float32"}, {"name": "1870", "dtype": "float32"}, {"name": "1871", "dtype": "float32"}, {"name": "1872", "dtype": "float32"}, {"name": "1873", "dtype": "float32"}, {"name": "1874", "dtype": "float32"}, {"name": "1875", "dtype": "float32"}, {"name": "1876", "dtype": "float32"}, {"name": "1877", "dtype": "float32"}, {"name": "1878", "dtype": "float32"}, {"name": "1879", "dtype": "float32"}, {"name": "1880", "dtype": "float32"}, {"name": "1881", "dtype": "float32"}, {"name": "1882", "dtype": "float32"}, {"name": "1883", "dtype": "float32"}, {"name": "1884", "dtype": "float32"}, {"name": "1885", "dtype": "float32"}, {"name": "1886", "dtype": "float32"}, {"name": "1887", "dtype": "float32"}, {"name": "1888", "dtype": "float32"}, {"name": "1889", "dtype": "float32"}, {"name": "1890", "dtype": "float32"}, {"name": "1891", "dtype": "float32"}, {"name": "1892", "dtype": "float32"}, {"name": "1893", "dtype": "float32"}, {"name": 
"1894", "dtype": "float32"}, {"name": "1895", "dtype": "float32"}, {"name": "1896", "dtype": "float32"}, {"name": "1897", "dtype": "float32"}, {"name": "1898", "dtype": "float32"}, {"name": "1899", "dtype": "float32"}, {"name": "1900", "dtype": "float32"}, {"name": "1901", "dtype": "float32"}, {"name": "1902", "dtype": "float32"}, {"name": "1903", "dtype": "float32"}, {"name": "1904", "dtype": "float32"}, {"name": "1905", "dtype": "float32"}, {"name": "1906", "dtype": "float32"}, {"name": "1907", "dtype": "float32"}, {"name": "1908", "dtype": "float32"}, {"name": "1909", "dtype": "float32"}, {"name": "1910", "dtype": "float32"}, {"name": "1911", "dtype": "float32"}, {"name": "1912", "dtype": "float32"}, {"name": "1913", "dtype": "float32"}, {"name": "1914", "dtype": "float32"}, {"name": "1915", "dtype": "float32"}, {"name": "1916", "dtype": "float32"}, {"name": "1917", "dtype": "float32"}, {"name": "1918", "dtype": "float32"}, {"name": "1919", "dtype": "float32"}, {"name": "1920", "dtype": "float32"}, {"name": "1921", "dtype": "float32"}, {"name": "1922", "dtype": "float32"}, {"name": "1923", "dtype": "float32"}, {"name": "1924", "dtype": "float32"}, {"name": "1925", "dtype": "float32"}, {"name": "1926", "dtype": "float32"}, {"name": "1927", "dtype": "float32"}, {"name": "1928", "dtype": "float32"}, {"name": "1929", "dtype": "float32"}, {"name": "1930", "dtype": "float32"}, {"name": "1931", "dtype": "float32"}, {"name": "1932", "dtype": "float32"}, {"name": "1933", "dtype": "float32"}, {"name": "1934", "dtype": "float32"}, {"name": "1935", "dtype": "float32"}, {"name": "1936", "dtype": "float32"}, {"name": "1937", "dtype": "float32"}, {"name": "1938", "dtype": "float32"}, {"name": "1939", "dtype": "float32"}, {"name": "1940", "dtype": "float32"}, {"name": "1941", "dtype": "float32"}, {"name": "1942", "dtype": "float32"}, {"name": "1943", "dtype": "float32"}, {"name": "1944", "dtype": "float32"}, {"name": "1945", "dtype": "float32"}, {"name": "1946", "dtype": "float32"}, {"name": "1947", "dtype": "float32"}, {"name": "1948", "dtype": "float32"}, {"name": "1949", "dtype": "float32"}, {"name": "1950", "dtype": "float32"}, {"name": "1951", "dtype": "float32"}, {"name": "1952", "dtype": "float32"}, {"name": "1953", "dtype": "float32"}, {"name": "1954", "dtype": "float32"}, {"name": "1955", "dtype": "float32"}, {"name": "1956", "dtype": "float32"}, {"name": "1957", "dtype": "float32"}, {"name": "1958", "dtype": "float32"}, {"name": "1959", "dtype": "float32"}, {"name": "1960", "dtype": "float32"}, {"name": "1961", "dtype": "float32"}, {"name": "1962", "dtype": "float32"}, {"name": "1963", "dtype": "float32"}, {"name": "1964", "dtype": "float32"}, {"name": "1965", "dtype": "float32"}, {"name": "1966", "dtype": "float32"}, {"name": "1967", "dtype": "float32"}, {"name": "1968", "dtype": "float32"}, {"name": "1969", "dtype": "float32"}, {"name": "1970", "dtype": "float32"}, {"name": "1971", "dtype": "float32"}, {"name": "1972", "dtype": "float32"}, {"name": "1973", "dtype": "float32"}, {"name": "1974", "dtype": "float32"}, {"name": "1975", "dtype": "float32"}, {"name": "1976", "dtype": "float32"}, {"name": "1977", "dtype": "float32"}, {"name": "1978", "dtype": "float32"}, {"name": "1979", "dtype": "float32"}, {"name": "1980", "dtype": "float32"}, {"name": "1981", "dtype": "float32"}, {"name": "1982", "dtype": "float32"}, {"name": "1983", "dtype": "float32"}, {"name": "1984", "dtype": "float32"}, {"name": "1985", "dtype": "float32"}, {"name": "1986", "dtype": "float32"}, {"name": "1987", "dtype": 
"float32"}, {"name": "1988", "dtype": "float32"}, {"name": "1989", "dtype": "float32"}, {"name": "1990", "dtype": "float32"}, {"name": "1991", "dtype": "float32"}, {"name": "1992", "dtype": "float32"}, {"name": "1993", "dtype": "float32"}, {"name": "1994", "dtype": "float32"}, {"name": "1995", "dtype": "float32"}, {"name": "1996", "dtype": "float32"}, {"name": "1997", "dtype": "float32"}, {"name": "1998", "dtype": "float32"}, {"name": "1999", "dtype": "float32"}, {"name": "2000", "dtype": "float32"}, {"name": "2001", "dtype": "float32"}, {"name": "2002", "dtype": "float32"}, {"name": "2003", "dtype": "float32"}, {"name": "2004", "dtype": "float32"}, {"name": "2005", "dtype": "float32"}, {"name": "2006", "dtype": "float32"}, {"name": "2007", "dtype": "float32"}, {"name": "2008", "dtype": "float32"}, {"name": "2009", "dtype": "float32"}, {"name": "2010", "dtype": "float32"}, {"name": "2011", "dtype": "float32"}, {"name": "2012", "dtype": "float32"}, {"name": "2013", "dtype": "float32"}, {"name": "2014", "dtype": "float32"}, {"name": "2015", "dtype": "float32"}, {"name": "2016", "dtype": "float32"}, {"name": "2017", "dtype": "float32"}, {"name": "2018", "dtype": "float32"}, {"name": "2019", "dtype": "float32"}, {"name": "2020", "dtype": "float32"}, {"name": "2021", "dtype": "float32"}, {"name": "2022", "dtype": "float32"}, {"name": "2023", "dtype": "float32"}, {"name": "2024", "dtype": "float32"}, {"name": "2025", "dtype": "float32"}, {"name": "2026", "dtype": "float32"}, {"name": "2027", "dtype": "float32"}, {"name": "2028", "dtype": "float32"}, {"name": "2029", "dtype": "float32"}, {"name": "2030", "dtype": "float32"}, {"name": "2031", "dtype": "float32"}, {"name": "2032", "dtype": "float32"}, {"name": "2033", "dtype": "float32"}, {"name": "2034", "dtype": "float32"}, {"name": "2035", "dtype": "float32"}, {"name": "2036", "dtype": "float32"}, {"name": "2037", "dtype": "float32"}, {"name": "2038", "dtype": "float32"}, {"name": "2039", "dtype": "float32"}, {"name": "2040", "dtype": "float32"}, {"name": "2041", "dtype": "float32"}, {"name": "2042", "dtype": "float32"}, {"name": "2043", "dtype": "float32"}, {"name": "2044", "dtype": "float32"}, {"name": "2045", "dtype": "float32"}, {"name": "2046", "dtype": "float32"}, {"name": "2047", "dtype": "float32"}, {"name": "label", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 307608907.5, "num_examples": 37500}, {"name": "test", "num_bytes": 102536305.0, "num_examples": 12500}], "download_size": 565388883, "dataset_size": 410145212.5}}
|
2023-08-23T06:21:16+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "PKDD_GPTNEO_Finetuned"
More Information needed
|
[
"# Dataset Card for \"PKDD_GPTNEO_Finetuned\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"PKDD_GPTNEO_Finetuned\"\n\nMore Information needed"
] |
[
6,
21
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"PKDD_GPTNEO_Finetuned\"\n\nMore Information needed"
] |
704b7bbfa0df910eedce92b322858dcac15cc0a4
|
# Dataset of yasaka_kanako/八坂神奈子/야사카카나코 (Touhou)
This is the dataset of yasaka_kanako/八坂神奈子/야사카카나코 (Touhou), containing 500 images and their tags.
The core tags of this character are `short_hair, purple_hair, red_eyes, hair_ornament, breasts, leaf_hair_ornament`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 516.55 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yasaka_kanako_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 352.72 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yasaka_kanako_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 986 | 644.22 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yasaka_kanako_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 480.08 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yasaka_kanako_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 986 | 825.59 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yasaka_kanako_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/yasaka_kanako_touhou',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
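Once loaded, items can be filtered by tag before being saved back out. The following is a minimal sketch, assuming (as the loop above suggests) that `item.image` is a PIL image and that `item.meta['tags']` is an iterable of tag strings; if it is a tag-to-score mapping instead, membership tests and the join below still operate on the keys:
```python
import os
from waifuc.source import LocalSource

source = LocalSource('dataset_dir')  # the directory extracted above
out_dir = 'kanako_solo'
os.makedirs(out_dir, exist_ok=True)
for idx, item in enumerate(source):
    # keep only images tagged 'solo' and write image/caption pairs side by side
    if 'solo' in item.meta['tags']:
        item.image.save(os.path.join(out_dir, f'{idx}.png'))
        with open(os.path.join(out_dir, f'{idx}.txt'), 'w', encoding='utf-8') as f:
            f.write(', '.join(item.meta['tags']))
```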
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 18 |  |  |  |  |  | 1girl, mirror, shimenawa, solo, leaf, smile, shide, onbashira, sandals |
| 1 | 5 |  |  |  |  |  | 1girl, sakazuki, sake, shimenawa, solo, mirror, shide, blue_hair, sandals, smile |
| 2 | 5 |  |  |  |  |  | 1girl, bangs, black_skirt, mirror, red_shirt, shimenawa, short_over_long_sleeves, solo, looking_at_viewer, shide, simple_background, smile, large_breasts, open_mouth, puffy_sleeves, white_background, brown_skirt, closed_mouth, cowboy_shot, long_skirt |
| 3 | 6 |  |  |  |  |  | 1girl, blush, huge_breasts, nipples, rope, solo, sweat, mosaic_censoring, bangs, purple_eyes, sitting, thick_thighs, looking_at_viewer, navel, pussy_juice, socks, spread_legs |
| 4 | 7 |  |  |  |  |  | 1girl, cleavage, solo, large_breasts, navel, looking_at_viewer, red_bikini, leaf, white_bikini |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | mirror | shimenawa | solo | leaf | smile | shide | onbashira | sandals | sakazuki | sake | blue_hair | bangs | black_skirt | red_shirt | short_over_long_sleeves | looking_at_viewer | simple_background | large_breasts | open_mouth | puffy_sleeves | white_background | brown_skirt | closed_mouth | cowboy_shot | long_skirt | blush | huge_breasts | nipples | rope | sweat | mosaic_censoring | purple_eyes | sitting | thick_thighs | navel | pussy_juice | socks | spread_legs | cleavage | red_bikini | white_bikini |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------|:------------|:-------|:-------|:--------|:--------|:------------|:----------|:-----------|:-------|:------------|:--------|:--------------|:------------|:--------------------------|:--------------------|:--------------------|:----------------|:-------------|:----------------|:-------------------|:--------------|:---------------|:--------------|:-------------|:--------|:---------------|:----------|:-------|:--------|:-------------------|:--------------|:----------|:---------------|:--------|:--------------|:--------|:--------------|:-----------|:-------------|:---------------|
| 0 | 18 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | X | X | X | | X | X | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | X | X | X | | X | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | |
| 3 | 6 |  |  |  |  |  | X | | | X | | | | | | | | | X | | | | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | |
| 4 | 7 |  |  |  |  |  | X | | | X | X | | | | | | | | | | | | X | | X | | | | | | | | | | | | | | | | | X | | | | X | X | X |
|
CyberHarem/yasaka_kanako_touhou
|
[
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] |
2023-08-18T03:12:59+00:00
|
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
|
2024-01-14T15:05:11+00:00
|
[] |
[] |
TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
|
Dataset of yasaka\_kanako/八坂神奈子/야사카카나코 (Touhou)
===============================================
This is the dataset of yasaka\_kanako/八坂神奈子/야사카카나코 (Touhou), containing 500 images and their tags.
The core tags of this character are 'short\_hair, purple\_hair, red\_eyes, hair\_ornament, breasts, leaf\_hair\_ornament', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the DeepGHS Team (huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for waifuc loading. If you need it, just run the following code.
List of Clusters
----------------
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
### Table Version
|
[
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
44,
61,
5,
4
] |
[
"passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version"
] |
fd603bc93a20a0f67ebf0b573e62d23a236565b5
|
# Dataset Card for "cot_explanation_targets_mosaicml-mpt-7b-8k-instruct"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
LahiruLowe/cot_explanation_targets_mosaicml-mpt-7b-8k-instruct
|
[
"region:us"
] |
2023-08-18T03:29:04+00:00
|
{"dataset_info": {"features": [{"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}, {"name": "task_source", "dtype": "string"}, {"name": "task_name", "dtype": "string"}, {"name": "template_type", "dtype": "string"}, {"name": "explained_targets", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 15709, "num_examples": 35}], "download_size": 13037, "dataset_size": 15709}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-08-18T03:44:00+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "cot_explanation_targets_mosaicml-mpt-7b-8k-instruct"
More Information needed
|
[
"# Dataset Card for \"cot_explanation_targets_mosaicml-mpt-7b-8k-instruct\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"cot_explanation_targets_mosaicml-mpt-7b-8k-instruct\"\n\nMore Information needed"
] |
[
6,
34
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"cot_explanation_targets_mosaicml-mpt-7b-8k-instruct\"\n\nMore Information needed"
] |
49c644010f0ee0da68e1f0809255a97ec6348e27
|
# Dataset Card for "t0_explanation_targets_mosaicml-mpt-7b-8k-instruct"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
LahiruLowe/t0_explanation_targets_mosaicml-mpt-7b-8k-instruct
|
[
"region:us"
] |
2023-08-18T03:30:18+00:00
|
{"dataset_info": {"features": [{"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}, {"name": "task_source", "dtype": "string"}, {"name": "task_name", "dtype": "string"}, {"name": "template_type", "dtype": "string"}, {"name": "explained_targets", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 116123, "num_examples": 77}], "download_size": 51066, "dataset_size": 116123}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-08-18T03:47:21+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "t0_explanation_targets_mosaicml-mpt-7b-8k-instruct"
More Information needed
|
[
"# Dataset Card for \"t0_explanation_targets_mosaicml-mpt-7b-8k-instruct\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"t0_explanation_targets_mosaicml-mpt-7b-8k-instruct\"\n\nMore Information needed"
] |
[
6,
35
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"t0_explanation_targets_mosaicml-mpt-7b-8k-instruct\"\n\nMore Information needed"
] |
9010eab9967f18331ed089e59b9903eaf7a6eee8
|
# Dataset of daiyousei/大妖精/대요정 (Touhou)
This is the dataset of daiyousei/大妖精/대요정 (Touhou), containing 500 images and their tags.
The core tags of this character are `green_hair, side_ponytail, wings, fairy_wings, ribbon, bow, green_eyes, hair_bow, hair_ribbon, short_hair, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 504.88 MiB | [Download](https://huggingface.co/datasets/CyberHarem/daiyousei_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 329.18 MiB | [Download](https://huggingface.co/datasets/CyberHarem/daiyousei_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1067 | 655.09 MiB | [Download](https://huggingface.co/datasets/CyberHarem/daiyousei_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 463.22 MiB | [Download](https://huggingface.co/datasets/CyberHarem/daiyousei_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1067 | 857.55 MiB | [Download](https://huggingface.co/datasets/CyberHarem/daiyousei_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/daiyousei_touhou',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 15 |  |  |  |  |  | 1girl, looking_at_viewer, solo, white_shirt, puffy_short_sleeves, smile, yellow_ascot, bangs, blue_skirt, blush, closed_mouth, blue_vest, collared_shirt, fairy, full_body, white_socks, simple_background, skirt_set, white_background, yellow_bow, blue_eyes, hair_between_eyes, long_hair, mary_janes |
| 1 | 14 |  |  |  |  |  | 1girl, short_sleeves, solo, puffy_sleeves, shirt, ascot, blush, open_mouth, looking_at_viewer, skirt_set, smile, vest, blue_eyes |
| 2 | 9 |  |  |  |  |  | 1girl, solo, ascot, blush, smile |
| 3 | 5 |  |  |  |  |  | 1girl, solo, blue_eyes, flower, smile, barefoot, dress |
| 4 | 8 |  |  |  |  |  | 2girls, blue_hair, blush, dress, blue_eyes, open_mouth, ascot, one_eye_closed, short_sleeves |
| 5 | 5 |  |  |  |  |  | 1girl, blush, flat_chest, loli, nipples, nude, solo, looking_at_viewer, navel, pussy, yellow_eyes, barefoot, lying, open_mouth |
| 6 | 6 |  |  |  |  |  | 1girl, hetero, sex, solo_focus, vaginal, 1boy, blush, mosaic_censoring, nipples, open_mouth, penis, pussy, tears, blue_eyes, navel, small_breasts |
| 7 | 8 |  |  |  |  |  | 2girls, blush, nipples, solo_focus, smile, flat_chest, navel, yuri, completely_nude, small_breasts, closed_mouth, loli, pussy, collarbone, holding_hands, standing, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | white_shirt | puffy_short_sleeves | smile | yellow_ascot | bangs | blue_skirt | blush | closed_mouth | blue_vest | collared_shirt | fairy | full_body | white_socks | simple_background | skirt_set | white_background | yellow_bow | blue_eyes | hair_between_eyes | long_hair | mary_janes | short_sleeves | puffy_sleeves | shirt | ascot | open_mouth | vest | flower | barefoot | dress | 2girls | blue_hair | one_eye_closed | flat_chest | loli | nipples | nude | navel | pussy | yellow_eyes | lying | hetero | sex | solo_focus | vaginal | 1boy | mosaic_censoring | penis | tears | small_breasts | yuri | completely_nude | collarbone | holding_hands | standing |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:--------------|:----------------------|:--------|:---------------|:--------|:-------------|:--------|:---------------|:------------|:-----------------|:--------|:------------|:--------------|:--------------------|:------------|:-------------------|:-------------|:------------|:--------------------|:------------|:-------------|:----------------|:----------------|:--------|:--------|:-------------|:-------|:---------|:-----------|:--------|:---------|:------------|:-----------------|:-------------|:-------|:----------|:-------|:--------|:--------|:--------------|:--------|:---------|:------|:-------------|:----------|:-------|:-------------------|:--------|:--------|:----------------|:-------|:------------------|:-------------|:----------------|:-----------|
| 0 | 15 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 14 |  |  |  |  |  | X | X | X | | | X | | | | X | | | | | | | | X | | | X | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 9 |  |  |  |  |  | X | | X | | | X | | | | X | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | | X | | | X | | | | | | | | | | | | | | | X | | | | | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 8 |  |  |  |  |  | | | | | | | | | | X | | | | | | | | | | | X | | | | X | | | X | X | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | X | X | | | | | | | X | | | | | | | | | | | | | | | | | | | X | | | X | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | |
| 6 | 6 |  |  |  |  |  | X | | | | | | | | | X | | | | | | | | | | | X | | | | | | | | X | | | | | | | | | | X | | X | X | | | X | X | X | X | X | X | X | X | X | | | | | |
| 7 | 8 |  |  |  |  |  | | | | | | X | | | | X | X | | | | | | | | X | | | | | | | | | | | | | | | X | | | X | X | X | | X | X | | | | | X | | | | | | X | X | X | X | X | X |
|
CyberHarem/daiyousei_touhou
|
[
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] |
2023-08-18T03:40:47+00:00
|
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
|
2024-01-14T13:59:05+00:00
|
[] |
[] |
TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
|
Dataset of daiyousei/大妖精/대요정 (Touhou)
=====================================
This is the dataset of daiyousei/大妖精/대요정 (Touhou), containing 500 images and their tags.
The core tags of this character are 'green\_hair, side\_ponytail, wings, fairy\_wings, ribbon, bow, green\_eyes, hair\_bow, hair\_ribbon, short\_hair, breasts', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the DeepGHS Team (huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for waifuc loading. If you need it, just run the following code.
List of Clusters
----------------
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
### Table Version
|
[
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
44,
61,
5,
4
] |
[
"passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version"
] |
1b82f9e51cb75e9c386807676b0695de76fa9f22
|
# Dataset of shiki_eiki/四季映姫・ヤマザナドゥ/四季映姫/시키에이키야마자나두 (Touhou)
This is the dataset of shiki_eiki/四季映姫・ヤマザナドゥ/四季映姫/시키에이키야마자나두 (Touhou), containing 500 images and their tags.
The core tags of this character are `green_hair, hat, short_hair, blue_eyes, ribbon`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:------------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 601.94 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shiki_eiki_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 366.46 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shiki_eiki_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1145 | 750.90 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shiki_eiki_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 543.28 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shiki_eiki_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1145 | 1018.10 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shiki_eiki_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/shiki_eiki_touhou',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 12 |  |  |  |  |  | 1girl, blue_vest, long_sleeves, solo, white_shirt, frilled_hat, looking_at_viewer, simple_background, bangs, epaulettes, holding, rod_of_remorse, white_background, blue_headwear, black_skirt, blush, bow, open_mouth, upper_body, closed_mouth |
| 1 | 10 |  |  |  |  |  | 1girl, black_footwear, black_skirt, blue_vest, full_body, long_sleeves, ribbon-trimmed_skirt, rod_of_remorse, solo, white_shirt, asymmetrical_hair, bangs, blue_headwear, frilled_hat, holding, white_socks, closed_mouth, looking_at_viewer, red_bow, epaulettes, white_ribbon, footwear_bow, red_ribbon, standing, white_background, white_bow, simple_background, buttons, green_eyes, mary_janes |
| 2 | 5 |  |  |  |  |  | 1girl, bangs, black_skirt, blue_vest, cowboy_shot, juliet_sleeves, looking_at_viewer, red_ribbon, ribbon-trimmed_skirt, rod_of_remorse, solo, wide_sleeves, blush, closed_mouth, epaulettes, holding, white_shirt, frilled_hat, frilled_skirt, hair_between_eyes, red_bow, smile, white_ribbon, black_background, blue_headwear, green_eyes, hat_ribbon, spider_lily, standing |
| 3 | 5 |  |  |  |  |  | 1girl, hat_ribbon, shirt, solo, vest, looking_at_viewer, rod_of_remorse, skirt, juliet_sleeves, spider_lily, open_mouth, petals, wide_sleeves |
| 4 | 7 |  |  |  |  |  | 1girl, black_thighhighs, rod_of_remorse, skirt, solo, wide_sleeves, hat_ribbon, long_sleeves, zettai_ryouiki, vest, green_eyes |
| 5 | 9 |  |  |  |  |  | 1girl, solo, rod_of_remorse, upper_body, blush, looking_at_viewer |
| 6 | 5 |  |  |  |  |  | 1girl, adapted_costume, black_thighhighs, detached_sleeves, skirt, solo, alternate_costume, bare_shoulders, blush, bow, looking_at_viewer, magical_girl, rod_of_remorse, smile, zettai_ryouiki, asymmetrical_hair, hat_ribbon, open_mouth, boots, frills |
| 7 | 5 |  |  |  |  |  | 1boy, 1girl, blush, hetero, mosaic_censoring, open_mouth, penis, sex, solo_focus, vaginal, bangs, blue_vest, cum_in_pussy, frilled_hat, nipples, on_back, white_shirt, blue_headwear, feet_out_of_frame, long_sleeves, looking_at_viewer, missionary, alternate_breast_size, black_skirt, bow, breast_grab, clothing_aside, cum_on_breasts, grabbing, hair_between_eyes, huge_breasts, navel, panties, pov, spread_legs |
| 8 | 11 |  |  |  |  |  | 1girl, hetero, open_mouth, penis, sex, solo_focus, vaginal, nipples, 1boy, blush, cum_in_pussy, cowgirl_position, girl_on_top, navel, nude, bangs, flat_chest, frilled_hat, mosaic_censoring, sweat, bar_censor, hair_between_eyes, heart, large_breasts, looking_at_viewer, tears, thighhighs |
| 9 | 5 |  |  |  |  |  | 1girl, blush, looking_at_viewer, navel, nipples, pussy, solo, standing, bangs, frilled_hat, open_mouth, simple_background, small_breasts, white_background, ass_visible_through_thighs, censored, completely_nude, asymmetrical_hair, collarbone, cowboy_shot, green_eyes, hair_between_eyes, red_ribbon, thigh_gap, white_ribbon |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blue_vest | long_sleeves | solo | white_shirt | frilled_hat | looking_at_viewer | simple_background | bangs | epaulettes | holding | rod_of_remorse | white_background | blue_headwear | black_skirt | blush | bow | open_mouth | upper_body | closed_mouth | black_footwear | full_body | ribbon-trimmed_skirt | asymmetrical_hair | white_socks | red_bow | white_ribbon | footwear_bow | red_ribbon | standing | white_bow | buttons | green_eyes | mary_janes | cowboy_shot | juliet_sleeves | wide_sleeves | frilled_skirt | hair_between_eyes | smile | black_background | hat_ribbon | spider_lily | shirt | vest | skirt | petals | black_thighhighs | zettai_ryouiki | adapted_costume | detached_sleeves | alternate_costume | bare_shoulders | magical_girl | boots | frills | 1boy | hetero | mosaic_censoring | penis | sex | solo_focus | vaginal | cum_in_pussy | nipples | on_back | feet_out_of_frame | missionary | alternate_breast_size | breast_grab | clothing_aside | cum_on_breasts | grabbing | huge_breasts | navel | panties | pov | spread_legs | cowgirl_position | girl_on_top | nude | flat_chest | sweat | bar_censor | heart | large_breasts | tears | thighhighs | pussy | small_breasts | ass_visible_through_thighs | censored | completely_nude | collarbone | thigh_gap |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:------------|:---------------|:-------|:--------------|:--------------|:--------------------|:--------------------|:--------|:-------------|:----------|:-----------------|:-------------------|:----------------|:--------------|:--------|:------|:-------------|:-------------|:---------------|:-----------------|:------------|:-----------------------|:--------------------|:--------------|:----------|:---------------|:---------------|:-------------|:-----------|:------------|:----------|:-------------|:-------------|:--------------|:-----------------|:---------------|:----------------|:--------------------|:--------|:-------------------|:-------------|:--------------|:--------|:-------|:--------|:---------|:-------------------|:-----------------|:------------------|:-------------------|:--------------------|:-----------------|:---------------|:--------|:---------|:-------|:---------|:-------------------|:--------|:------|:-------------|:----------|:---------------|:----------|:----------|:--------------------|:-------------|:------------------------|:--------------|:-----------------|:-----------------|:-----------|:---------------|:--------|:----------|:------|:--------------|:-------------------|:--------------|:-------|:-------------|:--------|:-------------|:--------|:----------------|:--------|:-------------|:--------|:----------------|:-----------------------------|:-----------|:------------------|:-------------|:------------|
| 0 | 12 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | X | | X | X | X | X | | X | X | X | X | | X | X | X | | | | X | | | X | | | X | X | | X | X | | | X | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | | | X | | | X | | | | | X | | | | | | X | | | | | | | | | | | | | | | | | | X | X | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 7 |  |  |  |  |  | X | | X | X | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | X | | | | X | | | | | X | | | X | X | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 9 |  |  |  |  |  | X | | | X | | | X | | | | | X | | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 5 |  |  |  |  |  | X | | | X | | | X | | | | | X | | | | X | X | X | | | | | | X | | | | | | | | | | | | | | | | X | | X | | | | X | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 5 |  |  |  |  |  | X | X | X | | X | X | X | | X | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | |
| 8 | 11 |  |  |  |  |  | X | | | | | X | X | | X | | | | | | | X | | X | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | X | | | | X | X | X | X | X | X | X | X | X | X | | | | | | | |
| 9 | 5 |  |  |  |  |  | X | | | X | | X | X | X | X | | | | X | | | X | | X | | | | | | X | | | X | | X | X | | | X | | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | X | | | | | | | | | | | | | | X | X | X | X | X | X | X |
|
CyberHarem/shiki_eiki_touhou
|
[
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] |
2023-08-18T03:59:40+00:00
|
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
|
2024-01-14T15:46:56+00:00
|
[] |
[] |
TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
|
Dataset of shiki\_eiki/四季映姫・ヤマザナドゥ/四季映姫/시키에이키야마자나두 (Touhou)
===========================================================
This is the dataset of shiki\_eiki/四季映姫・ヤマザナドゥ/四季映姫/시키에이키야마자나두 (Touhou), containing 500 images and their tags.
The core tags of this character are 'green\_hair, hat, short\_hair, blue\_eyes, ribbon', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the DeepGHS Team (huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for waifuc loading. If you need it, just run the following code.
List of Clusters
----------------
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
### Table Version
|
[
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
44,
61,
5,
4
] |
[
"passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version"
] |
2d0f692b7d4699edc1fc62fcdf8aa97184b87aec
|
# Dataset of letty_whiterock/レティホワイトロック/레티화이트락 (Touhou)
This is the dataset of letty_whiterock/レティホワイトロック/레티화이트락 (Touhou), containing 500 images and their tags.
The core tags of this character are `hat, short_hair, light_purple_hair, white_headwear, breasts, blue_eyes, purple_eyes, bangs, purple_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 565.84 MiB | [Download](https://huggingface.co/datasets/CyberHarem/letty_whiterock_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 352.13 MiB | [Download](https://huggingface.co/datasets/CyberHarem/letty_whiterock_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1097 | 706.33 MiB | [Download](https://huggingface.co/datasets/CyberHarem/letty_whiterock_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 513.84 MiB | [Download](https://huggingface.co/datasets/CyberHarem/letty_whiterock_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1097 | 950.09 MiB | [Download](https://huggingface.co/datasets/CyberHarem/letty_whiterock_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/letty_whiterock_touhou',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 14 |  |  |  |  |  | 1girl, blue_skirt, blue_vest, simple_background, solo, waist_apron, looking_at_viewer, smile, juliet_sleeves, white_apron, white_shirt, white_scarf, white_background, closed_mouth, snowflakes, blush, open_mouth |
| 1 | 10 |  |  |  |  |  | 1girl, blue_background, blue_skirt, blush, cowboy_shot, frilled_apron, looking_at_viewer, medium_hair, mob_cap, solo, waist_apron, white_apron, white_scarf, white_shirt, blue_vest, frilled_sleeves, hair_between_eyes, marker_(medium), sample_watermark, juliet_sleeves, snowflake_background, snowflakes, frilled_skirt, open_mouth, bow, frilled_hat, head_tilt, ribbon, :d, brooch, closed_mouth, wide_sleeves |
| 2 | 5 |  |  |  |  |  | 1girl, shirt, solo, vest, waist_apron, white_scarf, juliet_sleeves, smile, looking_at_viewer, blue_dress, closed_eyes, skirt_set |
| 3 | 5 |  |  |  |  |  | 1girl, blue_skirt, blue_vest, closed_mouth, full_body, smile, solo, waist_apron, white_apron, blush, frills, juliet_sleeves, looking_at_viewer, white_socks, simple_background, white_background, white_shirt, hair_between_eyes, white_bloomers |
| 4 | 5 |  |  |  |  |  | 1girl, blush, looking_at_viewer, smile, solo, upper_body, blue_vest, closed_mouth, hair_between_eyes, white_scarf, apron, juliet_sleeves, large_breasts |
| 5 | 7 |  |  |  |  |  | 1girl, solo, blush, long_sleeves, scarf, smile, apron |
| 6 | 9 |  |  |  |  |  | 1girl, boots, solo, apron, smile, bloomers, scarf |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blue_skirt | blue_vest | simple_background | solo | waist_apron | looking_at_viewer | smile | juliet_sleeves | white_apron | white_shirt | white_scarf | white_background | closed_mouth | snowflakes | blush | open_mouth | blue_background | cowboy_shot | frilled_apron | medium_hair | mob_cap | frilled_sleeves | hair_between_eyes | marker_(medium) | sample_watermark | snowflake_background | frilled_skirt | bow | frilled_hat | head_tilt | ribbon | :d | brooch | wide_sleeves | shirt | vest | blue_dress | closed_eyes | skirt_set | full_body | frills | white_socks | white_bloomers | upper_body | apron | large_breasts | long_sleeves | scarf | boots | bloomers |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------------|:------------|:--------------------|:-------|:--------------|:--------------------|:--------|:-----------------|:--------------|:--------------|:--------------|:-------------------|:---------------|:-------------|:--------|:-------------|:------------------|:--------------|:----------------|:--------------|:----------|:------------------|:--------------------|:------------------|:-------------------|:-----------------------|:----------------|:------|:--------------|:------------|:---------|:-----|:---------|:---------------|:--------|:-------|:-------------|:--------------|:------------|:------------|:---------|:--------------|:-----------------|:-------------|:--------|:----------------|:---------------|:--------|:--------|:-----------|
| 0 | 14 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 10 |  |  |  |  |  | X | X | X | | X | X | X | | X | X | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | | | | X | X | X | X | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | | X | X | | X | | | | | | | | X | | | | | | | | | | | | | | | | | X | X | X | X | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | | X | | X | | X | X | X | | | X | | X | | X | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | X | X | X | | | | |
| 5 | 7 |  |  |  |  |  | X | | | | X | | | X | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | X | X | | |
| 6 | 9 |  |  |  |  |  | X | | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | X | X | X |
|
CyberHarem/letty_whiterock_touhou
|
[
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] |
2023-08-18T04:10:20+00:00
|
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
|
2024-01-15T07:04:39+00:00
|
[] |
[] |
TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
|
Dataset of letty\_whiterock/レティホワイトロック/레티화이트락 (Touhou)
======================================================
This is the dataset of letty\_whiterock/レティホワイトロック/레티화이트락 (Touhou), containing 500 images and their tags.
The core tags of this character are 'hat, short\_hair, light\_purple\_hair, white\_headwear, breasts, blue\_eyes, purple\_eyes, bangs, purple\_hair', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the DeepGHS Team (huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for waifuc loading. If you need it, just run the following code.
List of Clusters
----------------
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
### Table Version
|
[
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
44,
61,
5,
4
] |
[
"passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version"
] |
86206d4c32777d35a4a397c3b79dade8d45b2872
|
# Dataset Card for "formatte_multi_nli"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
Andyrasika/formatted_multi_nli
|
[
"region:us"
] |
2023-08-18T04:39:14+00:00
|
{"dataset_info": {"features": [{"name": "premise", "dtype": "string"}, {"name": "hypothesis", "dtype": "string"}, {"name": "genre", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 8524829.921925532, "num_examples": 45000}], "download_size": 5761464, "dataset_size": 8524829.921925532}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-08-18T04:39:16+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "formatte_multi_nli"
More Information needed
|
[
"# Dataset Card for \"formatte_multi_nli\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"formatte_multi_nli\"\n\nMore Information needed"
] |
[
6,
17
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"formatte_multi_nli\"\n\nMore Information needed"
] |
e42660a6c87ca29a3f33d10ea6e2c2442703f7a8
|
# Dataset of nagae_iku/永江衣玖/나가에이쿠 (Touhou)
This is the dataset of nagae_iku/永江衣玖/나가에이쿠 (Touhou), containing 500 images and their tags.
The core tags of this character are `short_hair, hat, red_eyes, purple_hair, ribbon, bow, hat_ribbon, blue_hair, hat_bow`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 599.93 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nagae_iku_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 390.92 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nagae_iku_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1079 | 734.94 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nagae_iku_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 553.58 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nagae_iku_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1079 | 959.27 MiB | [Download](https://huggingface.co/datasets/CyberHarem/nagae_iku_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/nagae_iku_touhou',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, capelet, frills, shawl, smile, solo |
| 1 | 25 |  |  |  |  |  | 1girl, frills, shawl, solo, capelet, electricity, skirt, smile |
| 2 | 33 |  |  |  |  |  | 1girl, black_headwear, solo, white_shirt, hagoromo, long_sleeves, black_skirt, looking_at_viewer, smile, bangs, frilled_capelet, red_ribbon, closed_mouth, red_bow, white_capelet, blush, ascot, hair_between_eyes |
| 3 | 5 |  |  |  |  |  | 1girl, blush, large_breasts, solo, looking_at_viewer, nipples, shawl, upper_body, capelet, open_mouth, shirt |
| 4 | 6 |  |  |  |  |  | 2girls, peach, shawl, frills, long_hair |
| 5 | 8 |  |  |  |  |  | 1boy, 1girl, blush, hetero, solo_focus, penis, bangs, large_breasts, paizuri, fellatio, frills, heart, huge_breasts, nipples, nude, pov, upper_body, bar_censor, black_headwear, cum_on_breasts, looking_at_viewer, simple_background, smile, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | capelet | frills | shawl | smile | solo | electricity | skirt | black_headwear | white_shirt | hagoromo | long_sleeves | black_skirt | looking_at_viewer | bangs | frilled_capelet | red_ribbon | closed_mouth | red_bow | white_capelet | blush | ascot | hair_between_eyes | large_breasts | nipples | upper_body | open_mouth | shirt | 2girls | peach | long_hair | 1boy | hetero | solo_focus | penis | paizuri | fellatio | heart | huge_breasts | nude | pov | bar_censor | cum_on_breasts | simple_background | white_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:----------|:---------|:--------|:--------|:-------|:--------------|:--------|:-----------------|:--------------|:-----------|:---------------|:--------------|:--------------------|:--------|:------------------|:-------------|:---------------|:----------|:----------------|:--------|:--------|:--------------------|:----------------|:----------|:-------------|:-------------|:--------|:---------|:--------|:------------|:-------|:---------|:-------------|:--------|:----------|:-----------|:--------|:---------------|:-------|:------|:-------------|:-----------------|:--------------------|:-------------------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 25 |  |  |  |  |  | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 33 |  |  |  |  |  | X | | | | X | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | X | | X | | X | | | | | | | | X | | | | | | | X | | | X | X | X | X | X | | | | | | | | | | | | | | | | | |
| 4 | 6 |  |  |  |  |  | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | | | | | | | | | | | | | | |
| 5 | 8 |  |  |  |  |  | X | | X | | X | | | | X | | | | | X | X | | | | | | X | | | X | X | X | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
CyberHarem/nagae_iku_touhou
|
[
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] |
2023-08-18T04:43:25+00:00
|
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
|
2024-01-14T16:05:57+00:00
|
[] |
[] |
TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
|
Dataset of nagae\_iku/永江衣玖/나가에이쿠 (Touhou)
=========================================
This is the dataset of nagae\_iku/永江衣玖/나가에이쿠 (Touhou), containing 500 images and their tags.
The core tags of this character are 'short\_hair, hat, red\_eyes, purple\_hair, ribbon, bow, hat\_ribbon, blue\_hair, hat\_bow', which have been pruned from this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the DeepGHS Team (huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for waifuc loading. If you need it, just run the following code.
List of Clusters
----------------
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
### Table Version
|
[
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
44,
61,
5,
4
] |
[
"passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version"
] |
9c5918148f42ba067854e17754bd590c59462241
|
# Dataset Card for "fw_num_train_1000_eval_100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
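No card text is provided, but the dataset_info below names the schema. A minimal loading sketch — assuming the standard `datasets` loader resolves this repo:

```python
from datasets import load_dataset

# per the dataset_info: 'train' has 2100 rows, 'eval_find_word' has 100
ds = load_dataset("tyzhu/fw_num_train_1000_eval_100")
example = ds["train"][0]
print(example["inputs"], "->", example["targets"])
```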
|
tyzhu/fw_num_train_1000_eval_100
|
[
"region:us"
] |
2023-08-18T04:48:27+00:00
|
{"dataset_info": {"features": [{"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 137382, "num_examples": 2100}, {"name": "eval_find_word", "num_bytes": 4723, "num_examples": 100}], "download_size": 58570, "dataset_size": 142105}}
|
2023-08-18T04:48:33+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "fw_num_train_1000_eval_100"
More Information needed
|
[
"# Dataset Card for \"fw_num_train_1000_eval_100\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"fw_num_train_1000_eval_100\"\n\nMore Information needed"
] |
[
6,
24
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"fw_num_train_1000_eval_100\"\n\nMore Information needed"
] |
dd5010965ae708fbfa460d32351383648c96d69c
|
# Dataset of hoshiguma_yuugi/星熊勇儀/호시구마유기 (Touhou)
This is the dataset of hoshiguma_yuugi/星熊勇儀/호시구마유기 (Touhou), containing 500 images and their tags.
The core tags of this character are `blonde_hair, horns, single_horn, long_hair, red_eyes, breasts, large_breasts, pointy_ears`, which have been pruned from this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 679.91 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hoshiguma_yuugi_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 382.66 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hoshiguma_yuugi_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1182 | 774.95 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hoshiguma_yuugi_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 604.41 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hoshiguma_yuugi_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1182 | 1.07 GiB | [Download](https://huggingface.co/datasets/CyberHarem/hoshiguma_yuugi_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
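The IMG+TXT packages listed above pair every image with a same-named `.txt` tag file, the layout many trainers consume directly. A minimal sketch for fetching and reading the 800px package — the flat same-name pairing is an assumption inferred from the package type:

```python
import glob
import os
import zipfile

from huggingface_hub import hf_hub_download

# download and extract the 800px IMG+TXT package
zip_800 = hf_hub_download(
    repo_id='CyberHarem/hoshiguma_yuugi_touhou',
    repo_type='dataset',
    filename='dataset-800.zip',
)
out_dir = 'dataset_800'
os.makedirs(out_dir, exist_ok=True)
with zipfile.ZipFile(zip_800, 'r') as zf:
    zf.extractall(out_dir)

# each .txt file is assumed to hold the tags for the same-named image
for txt_path in glob.glob(os.path.join(out_dir, '**', '*.txt'), recursive=True):
    with open(txt_path, encoding='utf-8') as f:
        tags = f.read().strip()
    print(os.path.splitext(txt_path)[0], '->', tags)
```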
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/hoshiguma_yuugi_touhou',
    repo_type='dataset',
    filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 17 |  |  |  |  |  | 1girl, sakazuki, solo, chain, cuffs, sake, smile, geta, sitting |
| 1 | 5 |  |  |  |  |  | 1girl, chain, shirt, skirt, solo, shackles, grin, looking_at_viewer, short_sleeves |
| 2 | 12 |  |  |  |  |  | 1girl, shackles, short_sleeves, solo, star_(symbol), white_shirt, sakazuki, blue_skirt, chain, geta, looking_at_viewer, full_body, holding_cup, simple_background, white_background, red_horns, see-through, clenched_hand, grin, navel, sake, striped_skirt |
| 3 | 6 |  |  |  |  |  | 1girl, looking_at_viewer, smile, solo, simple_background, upper_body, white_background, white_shirt |
| 4 | 9 |  |  |  |  |  | 1girl, muscular_female, solo, nipples, nude, abs, huge_breasts, looking_at_viewer, navel, obliques, thick_thighs, grin, shackles |
| 5 | 22 |  |  |  |  |  | 1girl, futanari, huge_penis, nipples, testicles, abs, large_penis, solo, uncensored, muscular_female, looking_at_viewer, erection, huge_breasts, navel, very_long_hair, artist_name, blush, collarbone, completely_nude, oni, teeth, thick_thighs, veiny_penis, open_mouth, red_horns, simple_background, grin, steam, sweat, wet |
| 6 | 8 |  |  |  |  |  | 1boy, 1girl, hetero, solo_focus, blush, paizuri, penis, huge_breasts, nipples, pov, looking_at_viewer, mosaic_censoring, smile, cuffs, cum_on_breasts, fellatio, nude, shirt_lift, sweat, uncensored |
| 7 | 5 |  |  |  |  |  | 1girl, cleavage, fake_animal_ears, playboy_bunny, rabbit_ears, solo, detached_collar, looking_at_viewer, rabbit_tail, alternate_costume, bare_shoulders, black_leotard, fake_tail, grin, ponytail, red_bowtie, armpits, arms_up, bangs, brown_pantyhose, chain, collarbone, covered_navel, red_horns, shackles, simple_background, sitting, strapless_leotard, very_long_hair, white_background |
| 8 | 6 |  |  |  |  |  | 1girl, blush, nipples, solo, nude, anus, cum_in_pussy, spread_legs, bar_censor, cumdrip, open_mouth, spread_pussy |
| 9 | 5 |  |  |  |  |  | 1girl, enmaided, looking_at_viewer, maid_apron, maid_headdress, solo, white_apron, frilled_apron, waist_apron, bangs, blue_dress, chain, frilled_dress, full_body, holding, mary_janes, puffy_short_sleeves, shackles, twin_braids, white_thighhighs, back_bow, blue_footwear, bowtie, cleavage, closed_mouth, neck_ribbon, red_horns, sakazuki, simple_background, sitting, star_(symbol), white_background, white_bow |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | sakazuki | solo | chain | cuffs | sake | smile | geta | sitting | shirt | skirt | shackles | grin | looking_at_viewer | short_sleeves | star_(symbol) | white_shirt | blue_skirt | full_body | holding_cup | simple_background | white_background | red_horns | see-through | clenched_hand | navel | striped_skirt | upper_body | muscular_female | nipples | nude | abs | huge_breasts | obliques | thick_thighs | futanari | huge_penis | testicles | large_penis | uncensored | erection | very_long_hair | artist_name | blush | collarbone | completely_nude | oni | teeth | veiny_penis | open_mouth | steam | sweat | wet | 1boy | hetero | solo_focus | paizuri | penis | pov | mosaic_censoring | cum_on_breasts | fellatio | shirt_lift | cleavage | fake_animal_ears | playboy_bunny | rabbit_ears | detached_collar | rabbit_tail | alternate_costume | bare_shoulders | black_leotard | fake_tail | ponytail | red_bowtie | armpits | arms_up | bangs | brown_pantyhose | covered_navel | strapless_leotard | anus | cum_in_pussy | spread_legs | bar_censor | cumdrip | spread_pussy | enmaided | maid_apron | maid_headdress | white_apron | frilled_apron | waist_apron | blue_dress | frilled_dress | holding | mary_janes | puffy_short_sleeves | twin_braids | white_thighhighs | back_bow | blue_footwear | bowtie | closed_mouth | neck_ribbon | white_bow |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------|:-------|:--------|:--------|:-------|:--------|:-------|:----------|:--------|:--------|:-----------|:-------|:--------------------|:----------------|:----------------|:--------------|:-------------|:------------|:--------------|:--------------------|:-------------------|:------------|:--------------|:----------------|:--------|:----------------|:-------------|:------------------|:----------|:-------|:------|:---------------|:-----------|:---------------|:-----------|:-------------|:------------|:--------------|:-------------|:-----------|:-----------------|:--------------|:--------|:-------------|:------------------|:------|:--------|:--------------|:-------------|:--------|:--------|:------|:-------|:---------|:-------------|:----------|:--------|:------|:-------------------|:-----------------|:-----------|:-------------|:-----------|:-------------------|:----------------|:--------------|:------------------|:--------------|:--------------------|:-----------------|:----------------|:------------|:-----------|:-------------|:----------|:----------|:--------|:------------------|:----------------|:--------------------|:-------|:---------------|:--------------|:-------------|:----------|:---------------|:-----------|:-------------|:-----------------|:--------------|:----------------|:--------------|:-------------|:----------------|:----------|:-------------|:----------------------|:--------------|:-------------------|:-----------|:----------------|:---------|:---------------|:--------------|:------------|
| 0 | 17 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 5 |  |  |  |  |  | X | | X | X | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 12 |  |  |  |  |  | X | X | X | X | | X | | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 6 |  |  |  |  |  | X | | X | | | | X | | | | | | | X | | | X | | | | X | X | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 9 |  |  |  |  |  | X | | X | | | | | | | | | X | X | X | | | | | | | | | | | | X | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 22 |  |  |  |  |  | X | | X | | | | | | | | | | X | X | | | | | | | X | | X | | | X | | | X | X | | X | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 8 |  |  |  |  |  | X | | | | X | | X | | | | | | | X | | | | | | | | | | | | | | | | X | X | | X | | | | | | | X | | | | X | | | | | | | | X | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 5 |  |  |  |  |  | X | | X | X | | | | | X | | | X | X | X | | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | X | | | X | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 6 |  |  |  |  |  | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | | | | | | | | | | | | | X | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | |
| 9 | 5 |  |  |  |  |  | X | X | X | X | | | | | X | | | X | | X | | X | | | X | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
CyberHarem/hoshiguma_yuugi_touhou
|
[
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] |
2023-08-18T04:57:57+00:00
|
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
|
2024-01-14T16:30:50+00:00
|
[] |
[] |
TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
|
Dataset of hoshiguma\_yuugi/星熊勇儀/호시구마유기 (Touhou)
================================================
This is the dataset of hoshiguma\_yuugi/星熊勇儀/호시구마유기 (Touhou), containing 500 images and their tags.
The core tags of this character are 'blonde\_hair, horns, single\_horn, long\_hair, red\_eyes, breasts, large\_breasts, pointy\_ears', which have been pruned from this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the DeepGHS Team (huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for waifuc loading. If you need it, just run the following code.
List of Clusters
----------------
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
### Table Version
|
[
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
44,
61,
5,
4
] |
[
"passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version"
] |
19994bd0f94f99993856c61426bd16635ae9b20c
|

The 2023 Fake or Real: AI-generated Image Discrimination Competition dataset is now available on Hugging Face!
---
Hello🖐️
We are excited to announce the release of the dataset for the 2023 Fake or Real: AI-generated Image Discrimination Competition. The competition was held on AI CONNECT (https://aiconnect.kr/) from June 26th to July 6th, 2023, with 768 participants.
If you're interested in evaluating the performance of your model on the test dataset, we encourage you to visit the [competition page](https://aiconnect.kr/competition/detail/227/task/295/taskInfo) on AI CONNECT and submit your results. Please note that the site is currently available only in Korean. Of course, we data scientists can always use Chrome translate and/or even better translation models🥳. Plus, a multilingual service will be provided in the (hopefully near) future, so please stay tuned!
# Background
As the advancement of generative AI technology has enabled the easy creation of indistinguishable fake information from genuine content, concerns regarding its misuse have surfaced. Image generation AI, in particular, has raised significant alarm due to its potential risks such as identity theft, revenge porn, and political manipulation. In response, it has become imperative to develop technologies that can effectively discern between real and AI-generated fake images.
The training dataset consists of diffusiondb (https://huggingface.co/datasets/poloclub/diffusiondb) and Flickr images, with the inclusion of some low-quality fake images. For the test dataset, we took measures to construct it in a manner that closely resembles real-world scenarios involving image misuse. We utilized multiple generative AI models, fine-tuned on diverse photorealistic datasets, and applied negative prompt keywords like 'cartoon' and 'too many fingers' to generate realistic images.
We hope this dataset encourages the development of robust solutions and stimulates discussions on tackling the challenges associated with AI-generated fake images.
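If you want to pull the images locally, a minimal loading sketch — the split and column names are undocumented on this card, so treat everything below as an assumption:

```python
from datasets import load_dataset

# hypothetical usage: the card does not document splits or features
ds = load_dataset("mncai/Fake_or_Real_Competition_Dataset")
print(ds)
```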
Best Regards,
AI CONNECT
|
mncai/Fake_or_Real_Competition_Dataset
|
[
"task_categories:image-classification",
"language:en",
"license:apache-2.0",
"region:us"
] |
2023-08-18T05:03:31+00:00
|
{"language": ["en"], "license": "apache-2.0", "task_categories": ["image-classification"], "pretty_name": "aiconnect_fake_or_real"}
|
2023-08-28T00:48:27+00:00
|
[] |
[
"en"
] |
TAGS
#task_categories-image-classification #language-English #license-apache-2.0 #region-us
|
 future, so please stay tuned!
# Background
As the advancement of generative AI technology has enabled the easy creation of indistinguishable fake information from genuine content, concerns regarding its misuse have surfaced. Image generation AI, in particular, has raised significant alarm due to its potential risks such as identity theft, revenge porn, and political manipulation. In response, it has become imperative to develop technologies that can effectively discern between real and AI-generated fake images.
The training dataset consists of diffusiondb (URL and Flickr images, with the inclusion of some low-quality fake images. For the test dataset, we took measures to construct it in a manner that closely resembles real-world scenarios involving image misuse. We utilized multiple generative AI models, fine-tuned on diverse photorealistic datasets, and applied negative prompt keywords like 'cartoon' and 'too many fingers' to generate realistic images.
We hope this dataset encourages the development of robust solutions and stimulates discussions on tackling the challenges associated with AI-generated fake images.
Best Regards,
AI CONNECT
|
[
"# Background\nAs the advancement of generative AI technology has enabled the easy creation of indistinguishable fake information from genuine content, concerns regarding its misuse have surfaced. Image generation AI, in particular, has raised significant alarm due to its potential risks such as identity theft, revenge porn, and political manipulation. In response, it has become imperative to develop technologies that can effectively discern between real and AI-generated fake images.\n\nThe training dataset consists of diffusiondb (URL and Flickr images, with the inclusion of some low-quality fake images. For the test dataset, we took measures to construct it in a manner that closely resembles real-world scenarios involving image misuse. We utilized multiple generative AI models, fine-tuned on diverse photorealistic datasets, and applied negative prompt keywords like 'cartoon' and 'too many fingers' to generate realistic images.\n\nWe hope this dataset encourages the development of robust solutions and stimulates discussions on tackling the challenges associated with AI-generated fake images. \n\nBest Regards,\nAI CONNECT"
] |
[
"TAGS\n#task_categories-image-classification #language-English #license-apache-2.0 #region-us \n",
"# Background\nAs the advancement of generative AI technology has enabled the easy creation of indistinguishable fake information from genuine content, concerns regarding its misuse have surfaced. Image generation AI, in particular, has raised significant alarm due to its potential risks such as identity theft, revenge porn, and political manipulation. In response, it has become imperative to develop technologies that can effectively discern between real and AI-generated fake images.\n\nThe training dataset consists of diffusiondb (URL and Flickr images, with the inclusion of some low-quality fake images. For the test dataset, we took measures to construct it in a manner that closely resembles real-world scenarios involving image misuse. We utilized multiple generative AI models, fine-tuned on diverse photorealistic datasets, and applied negative prompt keywords like 'cartoon' and 'too many fingers' to generate realistic images.\n\nWe hope this dataset encourages the development of robust solutions and stimulates discussions on tackling the challenges associated with AI-generated fake images. \n\nBest Regards,\nAI CONNECT"
] |
[
29,
243
] |
[
"passage: TAGS\n#task_categories-image-classification #language-English #license-apache-2.0 #region-us \n# Background\nAs the advancement of generative AI technology has enabled the easy creation of indistinguishable fake information from genuine content, concerns regarding its misuse have surfaced. Image generation AI, in particular, has raised significant alarm due to its potential risks such as identity theft, revenge porn, and political manipulation. In response, it has become imperative to develop technologies that can effectively discern between real and AI-generated fake images.\n\nThe training dataset consists of diffusiondb (URL and Flickr images, with the inclusion of some low-quality fake images. For the test dataset, we took measures to construct it in a manner that closely resembles real-world scenarios involving image misuse. We utilized multiple generative AI models, fine-tuned on diverse photorealistic datasets, and applied negative prompt keywords like 'cartoon' and 'too many fingers' to generate realistic images.\n\nWe hope this dataset encourages the development of robust solutions and stimulates discussions on tackling the challenges associated with AI-generated fake images. \n\nBest Regards,\nAI CONNECT"
] |
330a2d0e0b87019d4976ec89f75c81b52d435c76
|
# Dataset Card for "fw_num_train_10000_eval_100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
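As with the 1000-example variant, a minimal sketch — here loading only the held-out split named in the dataset_info:

```python
from datasets import load_dataset

# fetch just the 100-example evaluation split
eval_ds = load_dataset("tyzhu/fw_num_train_10000_eval_100", split="eval_find_word")
print(len(eval_ds), eval_ds[0]["inputs"])
```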
|
tyzhu/fw_num_train_10000_eval_100
|
[
"region:us"
] |
2023-08-18T05:08:59+00:00
|
{"dataset_info": {"features": [{"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1318323, "num_examples": 20100}, {"name": "eval_find_word", "num_bytes": 4823, "num_examples": 100}], "download_size": 510406, "dataset_size": 1323146}}
|
2023-08-18T05:22:59+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "fw_num_train_10000_eval_100"
More Information needed
|
[
"# Dataset Card for \"fw_num_train_10000_eval_100\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"fw_num_train_10000_eval_100\"\n\nMore Information needed"
] |
[
6,
24
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"fw_num_train_10000_eval_100\"\n\nMore Information needed"
] |
59d94d799f3afe6a072b0c774ba7a09f513ad3f0
|
# Dataset of soga_no_tojiko/蘇我屠自古/소가노토지코 (Touhou)
This is the dataset of soga_no_tojiko/蘇我屠自古/소가노토지코 (Touhou), containing 500 images and their tags.
The core tags of this character are `green_hair, hat, short_hair, green_eyes, ghost_tail, black_headwear`, which have been pruned from this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 524.03 MiB | [Download](https://huggingface.co/datasets/CyberHarem/soga_no_tojiko_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 346.20 MiB | [Download](https://huggingface.co/datasets/CyberHarem/soga_no_tojiko_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1091 | 674.59 MiB | [Download](https://huggingface.co/datasets/CyberHarem/soga_no_tojiko_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 481.71 MiB | [Download](https://huggingface.co/datasets/CyberHarem/soga_no_tojiko_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1091 | 884.26 MiB | [Download](https://huggingface.co/datasets/CyberHarem/soga_no_tojiko_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/soga_no_tojiko_touhou',
    repo_type='dataset',
    filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
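The clusters below come from tag co-occurrence; to eyeball raw tag frequencies yourself, a Counter over the loaded items is enough — a minimal sketch, with `list(item.meta['tags'])` used so it works whether tags arrive as a list of strings or as a mapping keyed by tag name:

```python
from collections import Counter

from waifuc.source import LocalSource

dataset_dir = 'dataset_dir'  # the directory the raw archive was extracted to
counter = Counter()
for item in LocalSource(dataset_dir):
    # list() yields dict keys or list items either way
    counter.update(list(item.meta['tags']))
for tag, count in counter.most_common(10):
    print(f'{tag}: {count}')
```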
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 8 |  |  |  |  |  | 1girl, green_dress, solo, tate_eboshi, long_sleeves, electricity, smile, open_mouth |
| 1 | 6 |  |  |  |  |  | 1girl, green_dress, long_sleeves, looking_at_viewer, solo, tate_eboshi, large_breasts, electricity |
| 2 | 9 |  |  |  |  |  | 1girl, cross-laced_clothes, green_dress, long_sleeves, looking_at_viewer, ofuda_on_clothes, solo, tate_eboshi, bangs, closed_mouth, smile, breasts, electricity, frilled_sleeves, simple_background |
| 3 | 5 |  |  |  |  |  | 1girl, green_dress, juliet_sleeves, looking_at_viewer, simple_background, solo, tate_eboshi, upper_body, white_background, bangs, cross-laced_clothes, open_mouth, smile, closed_mouth, medium_breasts |
| 4 | 7 |  |  |  |  |  | bangs, blush, cowboy_shot, green_dress, hat_ribbon, looking_at_viewer, marker_(medium), medium_hair, ofuda_on_clothes, tate_eboshi, hair_between_eyes, sample_watermark, smile, wide_sleeves, 1girl, black_ribbon, blue_background, bowtie, closed_mouth, 2girls, frilled_dress, frilled_sleeves, juliet_sleeves, open_mouth, purple_bow, solo_focus |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | green_dress | solo | tate_eboshi | long_sleeves | electricity | smile | open_mouth | looking_at_viewer | large_breasts | cross-laced_clothes | ofuda_on_clothes | bangs | closed_mouth | breasts | frilled_sleeves | simple_background | juliet_sleeves | upper_body | white_background | medium_breasts | blush | cowboy_shot | hat_ribbon | marker_(medium) | medium_hair | hair_between_eyes | sample_watermark | wide_sleeves | black_ribbon | blue_background | bowtie | 2girls | frilled_dress | purple_bow | solo_focus |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------|:-------|:--------------|:---------------|:--------------|:--------|:-------------|:--------------------|:----------------|:----------------------|:-------------------|:--------|:---------------|:----------|:------------------|:--------------------|:-----------------|:-------------|:-------------------|:-----------------|:--------|:--------------|:-------------|:------------------|:--------------|:--------------------|:-------------------|:---------------|:---------------|:------------------|:---------|:---------|:----------------|:-------------|:-------------|
| 0 | 8 |  |  |  |  |  | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | X | X | X | X | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | | X | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | X | X | X | | | X | X | X | | X | | X | X | | | X | X | X | X | X | | | | | | | | | | | | | | | |
| 4 | 7 |  |  |  |  |  | X | X | | X | | | X | X | X | | | X | X | X | | X | | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
CyberHarem/soga_no_tojiko_touhou
|
[
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] |
2023-08-18T05:26:43+00:00
|
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
|
2024-01-14T19:21:22+00:00
|
[] |
[] |
TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
|
Dataset of soga\_no\_tojiko/蘇我屠自古/소가노토지코 (Touhou)
=================================================
This is the dataset of soga\_no\_tojiko/蘇我屠自古/소가노토지코 (Touhou), containing 500 images and their tags.
The core tags of this character are 'green\_hair, hat, short\_hair, green\_eyes, ghost\_tail, black\_headwear', which have been pruned from this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the DeepGHS Team (huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for waifuc loading. If you need it, just run the following code.
List of Clusters
----------------
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
### Table Version
|
[
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
44,
61,
5,
4
] |
[
"passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version"
] |
451a96f5bc960ba1ac315a47e1ca8d450fbbd376
|
<a href="https://www.healthnews360.org/hu/depanten-gel-velemenyek/">Depanten Scam</a>:- Joint pain is one of the compounds, also known as cannabinoids, found in the cannabis plant. However, it is very different from THC, the psychoactive component found in marijuana. The health benefits of cannabidiol range from relieving anxiety or stress and easing physical pain to making the brain healthier, supporting the metabolism, and much more. The company makes sure that its cannabidiol products come in a form that offers plenty of useful benefits for the body.
ORDER NOW
https://www.healthnews360.org/hu/depanten-gel-velemenyek/
SEARCH PAGE
https://www.fitprodiet.com/xtrazex-avis-medical/
https://www.fitprodiet.com/looper3-erfahrungen/
https://www.fitprodiet.com/rhino-gold-gel-avis/
https://www.fitprodiet.com/peoples-keto-gummies-avis/
LATEST SEARCH:-
https://sites.google.com/view/depantenglvlemnyek/
https://sites.google.com/view/depanten-tvers/
https://medium.com/@depanten.gel/depanten-gel-a-hat%C3%A9kony-f%C3%A1jdalomcsillap%C3%ADt%C3%B3-34e11bf4a4f3
https://medium.com/@depanten.gel/depanten-g%C3%A9l-a-hat%C3%A9kony-gyullad%C3%A1scs%C3%B6kkent%C5%91-k%C3%A9sz%C3%ADtm%C3%A9ny-57f171480dc7
https://depantengel.blogspot.com/2023/08/a-depanten-gel-vegso-utmutatoja.html
https://depantengel.blogspot.com/2023/08/depanten-gel-hatekony.html
https://www.apsense.com/page/a-depanten-gel-vgs-tmutatja-elfogulatlan-vlemnyek-
https://www.apsense.com/article/hol-kaphat-depanten-gl-tfog-ttekints-errl-a-csodatermkrl.html
https://groups.google.com/g/depanten-gl-vlemnyek/c/a-iR29AJ7LM
https://groups.google.com/g/depanten-gl-vlemnyek/c/Nr7aV3Wnlck
https://www.scoop.it/topic/depanten-gel-velemenyek/p/4146419708/2023/08/16/mi-is-az-a-depanten-gel-depanten-gel-velemenyek
https://www.scoop.it/topic/depanten-gel-velemenyek/p/4146419860/2023/08/16/depanten-gel-velemenyek-depanten-atveres-depanten-ar
https://youtu.be/CVtZCXjtkDc
https://twitter.com/DepantenV/status/1691712656053596634
https://twitter.com/DepantenV
https://in.pinterest.com/pin/905434700067418967
https://vimeo.com/854967600
https://depanten-gel.jimdosite.com/
https://soundcloud.com/depantengel/mi-is-az-a-depanten-gel
https://soundcloud.com/depantengel/depanten-gel-velemenyekdepanten-atveresdepanten-ar
https://community.weddingwire.in/forum/depanten-gel-velemenyek-depanten-ar--t133356
https://community.weddingwire.in/forum/a-depanten-gel-vegs-utmutatoja-elfogulatlan-velemenyek-es-bennfentes-tippek--t133362
|
depantengel/depantenatveres
|
[
"region:us"
] |
2023-08-18T05:37:52+00:00
|
{}
|
2023-08-18T05:38:57+00:00
|
[] |
[] |
TAGS
#region-us
|
<a href="URL">Depanten Scam</a>:- Joint pain is one of the compounds, also known as cannabinoids, found in the cannabis plant. However, it is very different from THC, the psychoactive component found in marijuana. The health benefits of cannabidiol range from relieving anxiety or stress and easing physical pain to making the brain healthier, supporting the metabolism, and much more. The company makes sure that its cannabidiol products come in a form that offers plenty of useful benefits for the body.
ORDER NOW
URL
SEARCH PAGE
URL
URL
URL
URL
LATEST SEARCH:-
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
URL
|
[] |
[
"TAGS\n#region-us \n"
] |
[
6
] |
[
"passage: TAGS\n#region-us \n"
] |
9aa3d8b56cb94ab8f34074ad40d3e3ae70848759
|
# Dataset Card for "fairness_mechanic_google_flan_t5_xxl_mode_T_SPECIFIC_A_ns_4800"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
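Since the dataset_info below exposes both `true_label` and `prediction` columns, a quick accuracy check is possible — a minimal sketch assuming the standard `datasets` loader works here and that the long split name is used verbatim:

```python
from datasets import load_dataset

split = ("fewshot_0__Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003"
         "_full_clip_tags_LAION_ViT_H_14_2B_simple_specific_rices")
ds = load_dataset(
    "CVasNLPExperiments/fairness_mechanic_google_flan_t5_xxl_mode_T_SPECIFIC_A_ns_4800",
    split=split,
)
# fraction of rows where the model's prediction matches the gold label
accuracy = sum(row["prediction"] == row["true_label"] for row in ds) / len(ds)
print(f"accuracy over {len(ds)} examples: {accuracy:.3f}")
```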
|
CVasNLPExperiments/fairness_mechanic_google_flan_t5_xxl_mode_T_SPECIFIC_A_ns_4800
|
[
"region:us"
] |
2023-08-18T06:14:33+00:00
|
{"dataset_info": {"features": [{"name": "id", "dtype": "int64"}, {"name": "prompt", "dtype": "string"}, {"name": "true_label", "dtype": "string"}, {"name": "scores", "sequence": "float64"}, {"name": "prediction", "dtype": "string"}], "splits": [{"name": "fewshot_0__Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full_clip_tags_LAION_ViT_H_14_2B_simple_specific_rices", "num_bytes": 2463238, "num_examples": 4800}], "download_size": 182959, "dataset_size": 2463238}}
|
2023-08-18T06:14:38+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "fairness_mechanic_google_flan_t5_xxl_mode_T_SPECIFIC_A_ns_4800"
More Information needed
|
[
"# Dataset Card for \"fairness_mechanic_google_flan_t5_xxl_mode_T_SPECIFIC_A_ns_4800\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"fairness_mechanic_google_flan_t5_xxl_mode_T_SPECIFIC_A_ns_4800\"\n\nMore Information needed"
] |
[
6,
41
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"fairness_mechanic_google_flan_t5_xxl_mode_T_SPECIFIC_A_ns_4800\"\n\nMore Information needed"
] |
6659f262dd9db0fcf49274dc834ac78b2f21c4a7
|
# Dataset Card for "hh-rrhf-dahoas-gptj-rm-25k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
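The `responses` and `scores` columns are parallel sequences, so picking the reward-model-preferred answer per prompt is straightforward — a minimal sketch assuming `scores[i]` corresponds to `responses[i]`:

```python
from datasets import load_dataset

ds = load_dataset("philschmid/hh-rrhf-dahoas-gptj-rm-25k", split="train")
row = ds[0]
# pair each response with its reward-model score and take the max
best_score, best_response = max(zip(row["scores"], row["responses"]))
print(row["prompt"])
print(f"best response ({best_score:.3f}):", best_response)
```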
|
philschmid/hh-rrhf-dahoas-gptj-rm-25k
|
[
"region:us"
] |
2023-08-18T06:15:03+00:00
|
{"dataset_info": {"features": [{"name": "prompt", "dtype": "string"}, {"name": "responses", "sequence": "string"}, {"name": "scores", "sequence": "float64"}], "splits": [{"name": "train", "num_bytes": 21973591, "num_examples": 24983}], "download_size": 12522534, "dataset_size": 21973591}}
|
2023-08-18T06:15:04+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "hh-rrhf-dahoas-gptj-rm-25k"
More Information needed
|
[
"# Dataset Card for \"hh-rrhf-dahoas-gptj-rm-25k\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"hh-rrhf-dahoas-gptj-rm-25k\"\n\nMore Information needed"
] |
[
6,
28
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"hh-rrhf-dahoas-gptj-rm-25k\"\n\nMore Information needed"
] |
de10cd4cc8dd7f42ff330bf2ff91ebb004c9f8e2
|
# Dataset of kumoi_ichirin/雲居一輪 (Touhou)
This is the dataset of kumoi_ichirin/雲居一輪 (Touhou), containing 500 images and their tags.
The core tags of this character are `blue_hair, blue_eyes, short_hair`, which have been pruned from this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 410.52 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kumoi_ichirin_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 301.79 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kumoi_ichirin_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 889 | 510.76 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kumoi_ichirin_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 386.39 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kumoi_ichirin_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 889 | 626.65 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kumoi_ichirin_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/kumoi_ichirin_touhou',
    repo_type='dataset',
    filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
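To turn the raw items into trainer-friendly image/caption pairs on disk, a minimal export sketch — the comma-joined caption format is an assumption, not something the card specifies:

```python
import os

from waifuc.source import LocalSource

dataset_dir = 'dataset_dir'  # where the raw archive was extracted above
export_dir = 'kumoi_ichirin_pairs'
os.makedirs(export_dir, exist_ok=True)
for item in LocalSource(dataset_dir):
    name = item.meta['filename']
    item.image.save(os.path.join(export_dir, name))
    # write the tags next to the image as a same-named .txt caption
    caption = ', '.join(list(item.meta['tags']))
    txt_name = os.path.splitext(name)[0] + '.txt'
    with open(os.path.join(export_dir, txt_name), 'w', encoding='utf-8') as f:
        f.write(caption)
```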
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 7 |  |  |  |  |  | 1girl, kesa, long_sleeves, white_dress, wide_sleeves, holding, hood_up, looking_at_viewer, simple_background, white_background, hoop, open_mouth, ring, smile, solo, bangs, full_body, one-hour_drawing_challenge, pendant, socks |
| 1 | 11 |  |  |  |  |  | 1girl, hood, hoop, kesa, long_sleeves, pendant, ring, wide_sleeves, smile, solo, open_mouth, dress, cloud, looking_at_viewer, necklace |
| 2 | 9 |  |  |  |  |  | 1girl, hood, kesa, long_sleeves, pendant, ring, wide_sleeves, dress, hoop, smile, 1boy, open_mouth, cloud |
| 3 | 13 |  |  |  |  |  | 1girl, hood, ring, solo, open_mouth, dress, smile, cloud |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | kesa | long_sleeves | white_dress | wide_sleeves | holding | hood_up | looking_at_viewer | simple_background | white_background | hoop | open_mouth | ring | smile | solo | bangs | full_body | one-hour_drawing_challenge | pendant | socks | hood | dress | cloud | necklace | 1boy |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:---------------|:--------------|:---------------|:----------|:----------|:--------------------|:--------------------|:-------------------|:-------|:-------------|:-------|:--------|:-------|:--------|:------------|:-----------------------------|:----------|:--------|:-------|:--------|:--------|:-----------|:-------|
| 0 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | |
| 1 | 11 |  |  |  |  |  | X | X | X | | X | | | X | | | X | X | X | X | X | | | | X | | X | X | X | X | |
| 2 | 9 |  |  |  |  |  | X | X | X | | X | | | | | | X | X | X | X | | | | | X | | X | X | X | | X |
| 3 | 13 |  |  |  |  |  | X | | | | | | | | | | | X | X | X | X | | | | | | X | X | X | | |
|
CyberHarem/kumoi_ichirin_touhou
|
[
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] |
2023-08-18T06:15:41+00:00
|
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
|
2024-01-14T20:45:11+00:00
|
[] |
[] |
TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
|
Dataset of kumoi\_ichirin/雲居一輪 (Touhou)
=======================================
This is the dataset of kumoi\_ichirin/雲居一輪 (Touhou), containing 500 images and their tags.
The core tags of this character are 'blue\_hair, blue\_eyes, short\_hair', which have been pruned from this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the DeepGHS Team (huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for waifuc loading. If you need it, just run the following code.
List of Clusters
----------------
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
### Table Version
|
[
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
44,
61,
5,
4
] |
[
"passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version"
] |
5cf3af20bed44d5b07371453bc5fb7d0cceb1708
|
# Dataset of imaizumi_kagerou/今泉影狼/이마이즈미카게로 (Touhou)
This is the dataset of imaizumi_kagerou/今泉影狼/이마이즈미카게로 (Touhou), containing 500 images and their tags.
The core tags of this character are `animal_ears, long_hair, wolf_ears, brown_hair, red_eyes, breasts, tail, wolf_tail, bangs, large_breasts`, which have been pruned from this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 632.33 MiB | [Download](https://huggingface.co/datasets/CyberHarem/imaizumi_kagerou_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 369.88 MiB | [Download](https://huggingface.co/datasets/CyberHarem/imaizumi_kagerou_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1200 | 769.24 MiB | [Download](https://huggingface.co/datasets/CyberHarem/imaizumi_kagerou_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 564.36 MiB | [Download](https://huggingface.co/datasets/CyberHarem/imaizumi_kagerou_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1200 | 1.05 GiB | [Download](https://huggingface.co/datasets/CyberHarem/imaizumi_kagerou_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
    repo_id='CyberHarem/imaizumi_kagerou_touhou',
    repo_type='dataset',
    filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
    print(item.image, item.meta['filename'], item.meta['tags'])
```
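A quick sanity check on the extracted archive — counting items and looking at the image-size spread (the table above says the raw package aligns the minimum edge to 1400 where larger); purely illustrative:

```python
from waifuc.source import LocalSource

dataset_dir = 'dataset_dir'  # where the raw archive was extracted above
sizes = [item.image.size for item in LocalSource(dataset_dir)]
print(len(sizes), 'items')
print('smallest width:', min(w for w, h in sizes))
print('largest width:', max(w for w, h in sizes))
```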
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, brooch, long_sleeves, looking_at_viewer, solo, upper_body, white_background, bare_shoulders, off-shoulder_dress, simple_background, smile, white_dress, blush, collarbone, cleavage, medium_breasts, wide_sleeves |
| 1 | 7 |  |  |  |  |  | 1girl, brooch, dress, fang, long_sleeves, open_mouth, solo, white_background, wide_sleeves, looking_at_viewer, simple_background, blush, collarbone |
| 2 | 14 |  |  |  |  |  | 1girl, brooch, dress, long_sleeves, solo, wide_sleeves, long_fingernails, nail_polish, looking_at_viewer, smile, open_mouth, red_nails, bare_shoulders, fangs, very_long_hair |
| 3 | 7 |  |  |  |  |  | 1girl, brooch, long_fingernails, long_sleeves, red_nails, sharp_fingernails, solo, white_dress, wide_sleeves, blush, collarbone, looking_at_viewer, nail_polish, open_mouth, simple_background, bare_shoulders, fang, off-shoulder_dress, upper_body |
| 4 | 12 |  |  |  |  |  | 1girl, brooch, dress, long_sleeves, solo, bamboo_forest, full_moon, looking_at_viewer, wide_sleeves, open_mouth, smile, very_long_hair, long_fingernails |
| 5 | 8 |  |  |  |  |  | 1girl, blush, looking_at_viewer, navel, simple_background, solo, white_background, animal_ear_fluff, black_bikini, cleavage, collarbone, cowboy_shot, side-tie_bikini_bottom, closed_mouth, smile, standing, bare_shoulders, groin, stomach, wolf_girl, hair_between_eyes |
| 6 | 6 |  |  |  |  |  | 1girl, animal_ear_fluff, brooch, closed_mouth, red_skirt, solo, looking_at_viewer, white_shirt, bare_shoulders, collarbone, simple_background, sleeveless_shirt, standing, very_long_hair, white_panties, bow, embarrassed, lifted_by_self, nose_blush, pantyshot, pleated_skirt, skirt_lift, wavy_mouth, wolf_girl |
| 7 | 6 |  |  |  |  |  | 1girl, blush, female_pubic_hair, looking_at_viewer, navel, nipples, solo, sweat, wolf_girl, completely_nude, open_mouth, smile, armpits, cowboy_shot, hair_between_eyes, simple_background, very_long_hair |
| 8 | 14 |  |  |  |  |  | 1boy, 1girl, blush, hetero, nipples, open_mouth, sex, solo_focus, vaginal, penis, sweat, mosaic_censoring, looking_at_viewer, navel, pov, cowgirl_position, girl_on_top, wolf_girl, animal_ear_fluff, cum_in_pussy, female_pubic_hair, completely_nude, hair_between_eyes, on_back, white_dress |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | brooch | long_sleeves | looking_at_viewer | solo | upper_body | white_background | bare_shoulders | off-shoulder_dress | simple_background | smile | white_dress | blush | collarbone | cleavage | medium_breasts | wide_sleeves | dress | fang | open_mouth | long_fingernails | nail_polish | red_nails | fangs | very_long_hair | sharp_fingernails | bamboo_forest | full_moon | navel | animal_ear_fluff | black_bikini | cowboy_shot | side-tie_bikini_bottom | closed_mouth | standing | groin | stomach | wolf_girl | hair_between_eyes | red_skirt | white_shirt | sleeveless_shirt | white_panties | bow | embarrassed | lifted_by_self | nose_blush | pantyshot | pleated_skirt | skirt_lift | wavy_mouth | female_pubic_hair | nipples | sweat | completely_nude | armpits | 1boy | hetero | sex | solo_focus | vaginal | penis | mosaic_censoring | pov | cowgirl_position | girl_on_top | cum_in_pussy | on_back |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------|:---------------|:--------------------|:-------|:-------------|:-------------------|:-----------------|:---------------------|:--------------------|:--------|:--------------|:--------|:-------------|:-----------|:-----------------|:---------------|:--------|:-------|:-------------|:-------------------|:--------------|:------------|:--------|:-----------------|:--------------------|:----------------|:------------|:--------|:-------------------|:---------------|:--------------|:-------------------------|:---------------|:-----------|:--------|:----------|:------------|:--------------------|:------------|:--------------|:-------------------|:----------------|:------|:--------------|:-----------------|:-------------|:------------|:----------------|:-------------|:-------------|:--------------------|:----------|:--------|:------------------|:----------|:-------|:---------|:------|:-------------|:----------|:--------|:-------------------|:------|:-------------------|:--------------|:---------------|:----------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | X | X | X | X | | X | | | X | | | X | X | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 14 |  |  |  |  |  | X | X | X | X | X | | | X | | | X | | | | | | X | X | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 7 |  |  |  |  |  | X | X | X | X | X | X | | X | X | X | | X | X | X | | | X | | X | X | X | X | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 12 |  |  |  |  |  | X | X | X | X | X | | | | | | X | | | | | | X | X | | X | X | | | | X | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 8 |  |  |  |  |  | X | | | X | X | | X | X | | X | X | | X | X | X | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 6 |  |  |  |  |  | X | X | | X | X | | | X | | X | | | | X | | | | | | | | | | | X | | | | | X | | | | X | X | | | X | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | |
| 7 | 6 |  |  |  |  |  | X | | | X | X | | | | | X | X | | X | | | | | | | X | | | | | X | | | | X | | | X | | | | | | X | X | | | | | | | | | | | | | X | X | X | X | X | | | | | | | | | | | | |
| 8 | 14 |  |  |  |  |  | X | | | X | | | | | | | | X | X | | | | | | | X | | | | | | | | | X | X | | | | | | | | X | X | | | | | | | | | | | | | X | X | X | X | | X | X | X | X | X | X | X | X | X | X | X | X |
|
CyberHarem/imaizumi_kagerou_touhou
|
[
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] |
2023-08-18T06:17:08+00:00
|
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
|
2024-01-14T16:15:06+00:00
|
[] |
[] |
TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
|
Dataset of imaizumi\_kagerou/今泉影狼/이마이즈미카게로 (Touhou)
===================================================
This is the dataset of imaizumi\_kagerou/今泉影狼/이마이즈미카게로 (Touhou), containing 500 images and their tags.
The core tags of this character are 'animal\_ears, long\_hair, wolf\_ears, brown\_hair, red\_eyes, breasts, tail, wolf\_tail, bangs, large\_breasts', which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the DeepGHS Team (a huggingface organization).
List of Packages
----------------
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for waifuc loading. If you need it, just run the following code:
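A minimal sketch of such loading code is shown below. It assumes the raw archive is published in this repository as `dataset-raw.zip` (the filename is an assumption; check the repository's file list), and uses `huggingface_hub` and `waifuc`, which the card's workflow relies on.

```python
import os
import zipfile

from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource

# Download the raw archive from this repository.
# NOTE: 'dataset-raw.zip' is an assumed filename; verify it in the repo.
zip_file = hf_hub_download(
    repo_id='CyberHarem/imaizumi_kagerou_touhou',
    repo_type='dataset',
    filename='dataset-raw.zip',
)

# Extract the tagged images into a local directory.
dataset_dir = 'imaizumi_kagerou_raw'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# Iterate over the extracted images with waifuc's LocalSource;
# each item carries a PIL image plus its metadata (tags, filename, ...).
source = LocalSource(dataset_dir)
for item in source:
    print(item.image.size, item.meta.get('tags'))
```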
List of Clusters
----------------
List of tag clustering results; some outfits may be mined from them.
### Raw Text Version
### Table Version
|
[
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n",
"### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.",
"### Raw Text Version",
"### Table Version"
] |
[
44,
61,
5,
4
] |
[
"passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n### Load Raw Dataset with Waifuc\n\n\nWe provide raw dataset (including tagged images) for waifuc loading. If you need this, just run the following code\n\n\nList of Clusters\n----------------\n\n\nList of tag clustering result, maybe some outfits can be mined here.### Raw Text Version### Table Version"
] |